Dec 03 12:13:30 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 12:13:30 crc restorecon[4665]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 
12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:30 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc 
restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 12:13:31 crc restorecon[4665]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
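[editor's note] The long restorecon run above is informational rather than a failure: libselinux treats types such as container_file_t as admin-customizable, so a normal relabel leaves those contexts in place and logs "not reset as customized by admin" for each path. Below is a minimal inspection sketch in Python, assuming the policycoreutils matchpathcon tool and GNU stat are present on the node; the path used is just one example lifted from the journal, and a forced relabel (restorecon -F) is the documented way to override the skip.

import subprocess

def current_context(path: str) -> str:
    # GNU stat prints the live SELinux security context of a file with %C.
    return subprocess.run(["stat", "-c", "%C", path],
                          capture_output=True, text=True, check=True).stdout.strip()

def default_context(path: str) -> str:
    # matchpathcon -n prints only the policy-default context for a path.
    return subprocess.run(["matchpathcon", "-n", path],
                          capture_output=True, text=True, check=True).stdout.strip()

if __name__ == "__main__":
    path = "/var/lib/kubelet/plugins"  # example path from the journal above
    print("live    :", current_context(path))
    print("default :", default_context(path))
    # restorecon -Rv skips admin-customized types such as container_file_t;
    # restorecon -RFv would force every context back to the policy default.
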
Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 12:13:31 crc kubenswrapper[4666]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.270606 4666 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273884 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273904 4666 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273909 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273914 4666 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273919 4666 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273922 4666 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273927 4666 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273931 4666 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273935 4666 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273941 4666 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273947 4666 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273952 4666 feature_gate.go:330] unrecognized feature gate: Example Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273956 4666 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273962 4666 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273967 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273972 4666 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
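[editor's note] Every "Flag ... has been deprecated" line above points at the same remedy: set the value in the file passed to the kubelet's --config flag. The following is a sketch of the corresponding KubeletConfiguration stanzas in YAML; the field names are the upstream kubelet.config.k8s.io/v1beta1 ones, but every value is a placeholder rather than this node's real configuration, and the CRI-O socket path in particular is an assumption.

# Sketch of a kubelet --config file covering the deprecated flags above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # replaces --container-runtime-endpoint (socket path assumed)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir (placeholder path)
registerWithTaints:                # replaces --register-with-taints
  - key: "node-role.kubernetes.io/master"  # placeholder taint
    effect: "NoSchedule"
systemReserved:                    # replaces --system-reserved (placeholder sizes)
  cpu: "500m"
  memory: "1Gi"
evictionHard:                      # the suggested replacement for --minimum-container-ttl-duration
  memory.available: "100Mi"
# --pod-infra-container-image has no config-file equivalent; per the warning
# above, the sandbox image is now taken from the CRI runtime instead.
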
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273977 4666 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273981 4666 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273985 4666 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273989 4666 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273992 4666 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273996 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.273999 4666 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274003 4666 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274007 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274011 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274015 4666 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274019 4666 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274025 4666 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274029 4666 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274032 4666 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274036 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274040 4666 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274043 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274047 4666 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274051 4666 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274055 4666 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274058 4666 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274062 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274066 4666 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274069 4666 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274073 4666 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274076 4666 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274080 4666 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274099 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274103 4666 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274108 4666 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274111 4666 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274115 4666 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274118 4666 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274122 4666 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274125 4666 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274129 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274132 4666 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274135 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274139 4666 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274142 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274146 4666 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274149 4666 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274153 4666 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274156 4666 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274164 4666 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
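The same set of unrecognized-feature-gate warnings recurs several times below as different startup components re-parse the gate configuration, so the repetition is expected rather than a fault. When triaging a dump like this it helps to collapse the repeats into a list of unique gate names; a minimal sketch, assuming the journal output has been saved to a file named kubelet.log (hypothetical):

    #!/usr/bin/env python3
    # Collapse the repeated "unrecognized feature gate" warnings into a
    # sorted, counted list. Assumes this journal dump has been saved to
    # kubelet.log (hypothetical filename); matching across the whole text
    # also catches gate names that were wrapped onto the next line.
    import re
    from collections import Counter
    from pathlib import Path

    text = Path("kubelet.log").read_text(encoding="utf-8")
    gates = Counter(re.findall(r"unrecognized feature gate:\s+(\S+)", text))

    for gate, count in sorted(gates.items()):
        print(f"{count:3d}  {gate}")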
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274169 4666 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274173 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274177 4666 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274180 4666 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274184 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274189 4666 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274193 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274199 4666 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.274203 4666 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274290 4666 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274298 4666 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274306 4666 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274311 4666 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274317 4666 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274322 4666 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274327 4666 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274335 4666 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274339 4666 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274343 4666 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274348 4666 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274352 4666 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274356 4666 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274360 4666 flags.go:64] FLAG: --cgroup-root=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274364 4666 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274369 4666 flags.go:64] FLAG: --client-ca-file=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274373 4666 flags.go:64] FLAG: --cloud-config=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274377 4666 flags.go:64] FLAG: --cloud-provider=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274381 4666 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274387 4666 flags.go:64] FLAG: --cluster-domain=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274390 4666 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274395 4666 flags.go:64] FLAG: --config-dir=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274400 4666 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274404 4666 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274410 4666 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274414 4666 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274419 4666 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274423 4666 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274427 4666 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274431 4666 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274435 4666 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274439 4666 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274443 4666 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274448 4666 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274452 4666 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274456 4666 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274460 4666 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274464 4666 flags.go:64] FLAG: --enable-server="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274468 4666 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274474 4666 flags.go:64] FLAG: --event-burst="100"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274481 4666 flags.go:64] FLAG: --event-qps="50"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274486 4666 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274490 4666 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274495 4666 flags.go:64] FLAG: --eviction-hard=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274501 4666 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274506 4666 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274511 4666 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274515 4666 flags.go:64] FLAG: --eviction-soft=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274519 4666 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274524 4666 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274528 4666 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274532 4666 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274537 4666 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274541 4666 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274545 4666 flags.go:64] FLAG: --feature-gates=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274551 4666 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274555 4666 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274559 4666 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274563 4666 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274568 4666 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274573 4666 flags.go:64] FLAG: --help="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274577 4666 flags.go:64] FLAG: --hostname-override=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274582 4666 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274587 4666 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274592 4666 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274597 4666 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274601 4666 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274606 4666 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274612 4666 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274617 4666 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274621 4666 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274627 4666 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274632 4666 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274636 4666 flags.go:64] FLAG: --kube-reserved=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274640 4666 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274644 4666 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274649 4666 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274652 4666 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274657 4666 flags.go:64] FLAG: --lock-file=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274662 4666 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274666 4666 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274670 4666 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274677 4666 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274681 4666 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274685 4666 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274690 4666 flags.go:64] FLAG: --logging-format="text"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274694 4666 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274699 4666 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274703 4666 flags.go:64] FLAG: --manifest-url=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274707 4666 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274713 4666 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274717 4666 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274722 4666 flags.go:64] FLAG: --max-pods="110"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274727 4666 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274732 4666 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274736 4666 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274741 4666 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274745 4666 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274749 4666 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274753 4666 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274764 4666 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274768 4666 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274773 4666 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274778 4666 flags.go:64] FLAG: --pod-cidr=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274782 4666 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274790 4666 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274794 4666 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274798 4666 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274802 4666 flags.go:64] FLAG: --port="10250"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274807 4666 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274811 4666 flags.go:64] FLAG: --provider-id=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274815 4666 flags.go:64] FLAG: --qos-reserved=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274820 4666 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274824 4666 flags.go:64] FLAG: --register-node="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274828 4666 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274832 4666 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274839 4666 flags.go:64] FLAG: --registry-burst="10"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274848 4666 flags.go:64] FLAG: --registry-qps="5"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274852 4666 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274856 4666 flags.go:64] FLAG: --reserved-memory=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274861 4666 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274865 4666 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274869 4666 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274873 4666 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274878 4666 flags.go:64] FLAG: --runonce="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274882 4666 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274886 4666 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274890 4666 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274894 4666 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274898 4666 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274903 4666 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274907 4666 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274911 4666 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274915 4666 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274919 4666 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274923 4666 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274928 4666 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274932 4666 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274936 4666 flags.go:64] FLAG: --system-cgroups=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274940 4666 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274946 4666 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274950 4666 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274954 4666 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274985 4666 flags.go:64] FLAG: --tls-min-version=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274989 4666 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274993 4666 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.274997 4666 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275001 4666 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275006 4666 flags.go:64] FLAG: --v="2"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275014 4666 flags.go:64] FLAG: --version="false"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275020 4666 flags.go:64] FLAG: --vmodule=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275025 4666 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275029 4666 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275149 4666 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275155 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275159 4666 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275163 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275167 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275171 4666 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275175 4666 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275179 4666 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275183 4666 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275187 4666 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275191 4666 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275194 4666 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275198 4666 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275202 4666 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275205 4666 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275212 4666 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275215 4666 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275219 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275223 4666 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275226 4666 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275231 4666 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275235 4666 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275239 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275243 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275247 4666 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275250 4666 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275254 4666 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275257 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275263 4666 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275266 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275270 4666 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275279 4666 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275284 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275288 4666 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275293 4666 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275297 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275302 4666 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
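The flags.go:64 dump above is the kubelet echoing every CLI flag value at startup, defaults included. To audit a node or diff two captures, the dump can be parsed into a dictionary; again a sketch under the same hypothetical kubelet.log assumption:

    #!/usr/bin/env python3
    # Parse the flags.go:64 dump above into a {flag: value} dict so two
    # captures can be diffed. Assumes the dump is saved as kubelet.log
    # (hypothetical filename).
    import re
    from pathlib import Path

    FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')
    flags = dict(FLAG_RE.findall(Path("kubelet.log").read_text(encoding="utf-8")))

    # Spot-check the values behind the deprecation warnings at the top.
    for name in ("--container-runtime-endpoint", "--system-reserved",
                 "--register-with-taints", "--volume-plugin-dir"):
        print(f"{name} = {flags[name]!r}")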
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275307 4666 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275312 4666 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275317 4666 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275322 4666 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275327 4666 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275332 4666 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275338 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275343 4666 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275347 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275351 4666 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275357 4666 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275361 4666 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275365 4666 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275369 4666 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275373 4666 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275376 4666 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275380 4666 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275383 4666 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275388 4666 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275391 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275395 4666 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275398 4666 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275402 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275409 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275412 4666 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275416 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275420 4666 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275424 4666 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275429 4666 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275433 4666 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275438 4666 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275442 4666 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275446 4666 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.275450 4666 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.275457 4666 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.287932 4666 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.288022 4666 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288147 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288159 4666 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288165 4666 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288170 4666 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288176 4666 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288182 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288188 4666 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288193 4666 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288199 4666 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288204 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288209 4666 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288214 4666 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288219 4666 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288224 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288231 4666 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288236 4666 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288242 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288248 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288254 4666 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288260 4666 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288266 4666 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288272 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288277 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288283 4666 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288288 4666 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288294 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288299 4666 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288305 4666 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288310 4666 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288317 4666 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288327 4666 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288334 4666 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288339 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288346 4666 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288354 4666 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288359 4666 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288365 4666 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288371 4666 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288376 4666 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288381 4666 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288386 4666 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288391 4666 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288396 4666 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288402 4666 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288407 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288413 4666 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288420 4666 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288427 4666 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288432 4666 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288439 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288444 4666 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288450 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288456 4666 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288461 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288467 4666 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288472 4666 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288478 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288484 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288489 4666 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288496 4666 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288503 4666 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288508 4666 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288514 4666 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288519 4666 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288524 4666 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288531 4666 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288537 4666 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288543 4666 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288550 4666 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288556 4666 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288562 4666 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.288573 4666 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288786 4666 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288797 4666 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288803 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288809 4666 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288814 4666 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288820 4666 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288825 4666 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288830 4666 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288835 4666 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288842 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288847 4666 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288853 4666 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288858 4666 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288866 4666 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288872 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288878 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288883 4666 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288889 4666 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288894 4666 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288899 4666 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288905 4666 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288910 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288915 4666 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288920 4666 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288926 4666 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288931 4666 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288936 4666 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288941 4666 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288948 4666 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288956 4666 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288962 4666 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288969 4666 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288974 4666 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288980 4666 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288987 4666 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.288994 4666 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289002 4666 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289008 4666 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289015 4666 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289021 4666 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289027 4666 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289033 4666 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289038 4666 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289046 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289051 4666 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289057 4666 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289063 4666 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289068 4666 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289073 4666 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289078 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289101 4666 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289106 4666 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289112 4666 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289117 4666 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289123 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289128 4666 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289133 4666 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289138 4666 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289143 4666 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289148 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289154 4666 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289159 4666 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289166 4666 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289172 4666 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289178 4666 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289183 4666 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289188 4666 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289194 4666 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289199 4666 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289205 4666 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.289211 4666 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.289221 4666 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.289785 4666 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.293331 4666 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.293473 4666 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
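Having validated the bootstrap kubeconfig, the kubelet loads its client certificate pair and, in the lines that follow, schedules rotation from the certificate's expiry: the 706h32m26s wait logged below is simply the gap between the log timestamp (Dec 3 12:13:31 UTC) and the jittered rotation deadline (Jan 1 22:45:57 UTC). To inspect the same certificate out-of-band, something like this sketch could be run on the node:

    #!/usr/bin/env python3
    # Read the expiry of the kubelet client certificate named in the log
    # line above. Sketch only: requires the third-party "cryptography"
    # package (>= 42 for not_valid_after_utc; older versions expose
    # not_valid_after) and assumes the certificate block precedes the
    # private key inside the combined PEM file.
    from pathlib import Path
    from cryptography import x509

    pem = Path("/var/lib/kubelet/pki/kubelet-client-current.pem").read_bytes()
    cert = x509.load_pem_x509_certificate(pem)

    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", cert.not_valid_after_utc)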
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.294151 4666 server.go:997] "Starting client certificate rotation"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.294187 4666 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.294335 4666 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-01 22:45:57.434353142 +0000 UTC
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.294403 4666 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 706h32m26.139952523s for next certificate rotation
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.310599 4666 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.312474 4666 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.326932 4666 log.go:25] "Validated CRI v1 runtime API"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.340642 4666 log.go:25] "Validated CRI v1 image API"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.342710 4666 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.345620 4666 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-12-06-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.345665 4666 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:40 fsType:tmpfs blockSize:0}]
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.358925 4666 manager.go:217] Machine: {Timestamp:2025-12-03 12:13:31.357651708 +0000 UTC m=+0.202612779 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:381ea1db-1c63-4f6e-af2b-f374cfb9263c BootID:965dbf1a-276c-4547-879d-6d43a85ca63c Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:40 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:67:05:26 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:67:05:26 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c7:b5:00 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d0:72:2b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:aa:76:80 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8c:43:f2 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a1:83:75 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5a:ba:35:fa:c4:14 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:6e:9c:d9:d6:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.359196 4666 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.359364 4666 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.359897 4666 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360069 4666 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360126 4666 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360351 4666 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360362 4666 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360548 4666 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360584 4666 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.360956 4666 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361061 4666 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361727 4666 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361750 4666 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361793 4666 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361811 4666 kubelet.go:324] "Adding apiserver pod source"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.361824 4666 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.363847 4666 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.364673 4666 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.365078 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.365195 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.365449 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.365532 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.366037 4666 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367378 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367405 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367413 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367422 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367438 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367448 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367458 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367472 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367488 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367504 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367518 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367530 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.367775 4666 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.368399 4666 server.go:1280] "Started kubelet"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.368508 4666 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.369569 4666 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 12:13:31 crc systemd[1]: Started Kubernetes Kubelet.
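The certificate_manager.go:356 lines above show the rotation arithmetic: the client certificate expires 2026-02-24 05:52:08, the manager picked a deadline of 2026-01-01 22:45:57, and it sleeps 706h32m until then. client-go's certificate manager schedules rotation at a jittered point inside the certificate's lifetime (roughly 70-90% of the NotBefore..NotAfter window). A minimal sketch of that computation in Go, with an assumed issue date since the log does not print NotBefore:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline mirrors the scheme logged above: pick a uniformly
// random point in [70%, 90%] of the certificate's lifetime. This is a
// sketch; the real logic lives in k8s.io/client-go/util/certificate.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.3*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:52:08Z")
	notBefore := notAfter.Add(-365 * 24 * time.Hour) // assumed issue date, not from the log
	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Println("rotation deadline:", deadline)
	fmt.Println("waiting:", time.Until(deadline)) // cf. "Waiting 706h32m26.139952523s"
}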
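The HardEvictionThresholds array in the nodeConfig above mixes absolute quantities (memory.available < 100Mi) with percentages (nodefs.available < 10% of capacity). A small sketch of how such a threshold can be evaluated; the types below are illustrative, not the kubelet's eviction API:

package main

import "fmt"

// threshold mirrors one HardEvictionThresholds entry: a signal compared
// with LessThan against either an absolute quantity (bytes) or a
// percentage of capacity.
type threshold struct {
	signal     string
	quantity   int64   // absolute bytes; 0 means "use percentage"
	percentage float64 // fraction of capacity, e.g. 0.1 for 10%
}

// breached reports whether the available amount falls below the threshold.
func breached(t threshold, available, capacity int64) bool {
	limit := t.quantity
	if limit == 0 {
		limit = int64(t.percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	mem := threshold{signal: "memory.available", quantity: 100 << 20}  // 100Mi
	nodefs := threshold{signal: "nodefs.available", percentage: 0.1}   // 10%

	fmt.Println(breached(mem, 64<<20, 32<<30))            // true: 64Mi < 100Mi
	fmt.Println(breached(nodefs, 20<<30, 85_292_941_312)) // false: ~23% of /var (vda4) still free
}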
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.370886 4666 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.371491 4666 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.371535 4666 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.371183 4666 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.372292 4666 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 00:08:14.210681607 +0000 UTC
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.372482 4666 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 779h54m42.838203912s for next certificate rotation
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.372805 4666 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.373191 4666 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.373225 4666 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.373445 4666 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.373784 4666 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187db384aa014f05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:13:31.368369925 +0000 UTC m=+0.213330976,LastTimestamp:2025-12-03 12:13:31.368369925 +0000 UTC m=+0.213330976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.377354 4666 factory.go:55] Registering systemd factory
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.377405 4666 factory.go:221] Registration of the systemd container factory successfully
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.377410 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378003 4666 server.go:460] "Adding debug handlers to kubelet server"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378600 4666 factory.go:153] Registering CRI-O factory
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378662 4666 factory.go:221] Registration of the crio container factory successfully
Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.378567 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.378755 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378798 4666 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378862 4666 factory.go:103] Registering Raw factory
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.378910 4666 manager.go:1196] Started watching for new ooms in manager
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.380051 4666 manager.go:319] Starting recovery of all containers
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386132 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386199 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386215 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386231 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386244 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386261 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386277 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386294 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386310 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386322 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386336 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386362 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386375 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386392 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386407 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386421 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386434 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386494 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386506 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386517 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386531 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386543 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386559 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386574 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386607 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386625 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386644 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386659 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386672 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386683 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386695 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386708 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386723 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386738 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386752 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386765 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386780 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386794 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386808 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386823 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386838 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386852 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386864 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386876 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386888 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386904 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386920 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386934 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386953 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386968 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386982 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.386995 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387013 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387030 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387047 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387061 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387076 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387110 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387124 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387138 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387151 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387165 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387180 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387193 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387207 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387218 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387230 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387243 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387258 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387272 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387286 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387299 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387312 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387327 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387341 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387353 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387365 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387378 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387392 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387407 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387425 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387438 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387450 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387464 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387478 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387493 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387507 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387523 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387535 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387549 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387562 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387578 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387594 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387609 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387622 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387640 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387658 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387670 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387684 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387698 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387713 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387728 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.387742 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388126 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388168 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388191 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388277 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388289 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388308 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388322 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388333 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388346 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388358 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388400 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388417 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388434 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388449 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388487 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388501 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388515 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388527 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388538 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388555 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388565 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388581 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388592 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388604 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388619 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388631 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388643 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388653 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388667 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388679 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388692 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388707 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388718 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388728 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388741 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388753 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388766 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388778 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388792 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388807 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388817 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388829 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388842 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388852 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388867 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.388880 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389231 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389305 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389323 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389354 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389369 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389397 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389416 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389439 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389458 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389471 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389500 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389520 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389541 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389562 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389574 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389594 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389610 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389628 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389648 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389661 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389677 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389689 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389702 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389735 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389748 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389766 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389783 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389798 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389816 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389830 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389842 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389859 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389875 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389904 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389916 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389929 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389947 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389958 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389974 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.389989 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390001 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390015 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390026 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390044 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390665 4666 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390705 4666 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390719 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390739 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390761 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390774 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390793 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390808 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390825 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390841 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390856 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390878 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390892 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390910 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390925 4666 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390935 4666 reconstruct.go:97] "Volume reconstruction finished" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.390942 4666 reconciler.go:26] "Reconciler: start to sync state" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.403802 4666 manager.go:324] Recovery completed Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.414902 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.417445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.417497 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.417509 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.418462 4666 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.418481 4666 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.418504 4666 state_mem.go:36] "Initialized new in-memory state store" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.419136 4666 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.421834 4666 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.421926 4666 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.422268 4666 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.422328 4666 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.423033 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.423128 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.430609 4666 policy_none.go:49] "None policy: Start" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.434613 4666 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.434652 4666 state_mem.go:35] "Initializing new in-memory state store" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.473313 4666 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.489497 4666 manager.go:334] "Starting Device Plugin manager" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.489566 4666 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.489581 4666 server.go:79] "Starting device plugin registration server" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.490035 4666 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.490066 4666 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.490236 4666 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.490347 4666 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.490355 4666 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.499067 4666 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.523233 4666 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 12:13:31 crc kubenswrapper[4666]: 
I1203 12:13:31.523361 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.524566 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.524625 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.524635 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.524811 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.525150 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.525217 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.526187 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.526232 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.526259 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.527510 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.528544 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.528714 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.528738 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.529907 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.529992 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531197 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531238 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531251 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531394 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531675 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.531760 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532190 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532219 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532244 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532336 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532392 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532457 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532691 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.532754 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533252 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533290 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533310 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533353 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533369 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533636 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533675 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533964 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.533975 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.534724 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.534795 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.534823 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.578991 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.591019 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592278 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592365 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592394 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592412 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592434 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592486 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592569 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592760 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592835 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592764 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.592898 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593038 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593203 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593276 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593338 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 
12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593390 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.593391 4666 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593438 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.593502 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694680 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694777 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694822 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694857 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694940 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.694893 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695012 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695041 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695038 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695164 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695017 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695210 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695064 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695081 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695282 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695063 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695267 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695366 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695402 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695422 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695444 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695465 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695486 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695533 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695622 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695683 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695793 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695854 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.695876 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.794471 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.796281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.796331 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.796345 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.796371 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.797037 4666 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.857195 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.869940 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.890566 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-51940bad4ddbcf9801a717631ded260de4fe4e9c752df22a49d0f417f4a3823d WatchSource:0}: Error finding container 51940bad4ddbcf9801a717631ded260de4fe4e9c752df22a49d0f417f4a3823d: Status 404 returned error can't find the container with id 51940bad4ddbcf9801a717631ded260de4fe4e9c752df22a49d0f417f4a3823d Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.894626 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e6d8bb5a19d23bf59fd0cbce78fcaf93d6b129501bbb54a1fdf5c74c9a708198 WatchSource:0}: Error finding container e6d8bb5a19d23bf59fd0cbce78fcaf93d6b129501bbb54a1fdf5c74c9a708198: Status 404 returned error can't find the container with id e6d8bb5a19d23bf59fd0cbce78fcaf93d6b129501bbb54a1fdf5c74c9a708198 Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.895581 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.913265 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: I1203 12:13:31.918437 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.922211 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0b67a09bce16883e1b1f597a313fb76ae8bd01835c9f3313caf1a470d40d1d32 WatchSource:0}: Error finding container 0b67a09bce16883e1b1f597a313fb76ae8bd01835c9f3313caf1a470d40d1d32: Status 404 returned error can't find the container with id 0b67a09bce16883e1b1f597a313fb76ae8bd01835c9f3313caf1a470d40d1d32 Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.937489 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-41ccb771ad77d1f471dfcd8537f264ef2c42216f525da439fb30eb1506ed94c2 WatchSource:0}: Error finding container 41ccb771ad77d1f471dfcd8537f264ef2c42216f525da439fb30eb1506ed94c2: Status 404 returned error can't find the container with id 41ccb771ad77d1f471dfcd8537f264ef2c42216f525da439fb30eb1506ed94c2 Dec 03 12:13:31 crc kubenswrapper[4666]: W1203 12:13:31.944439 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2bc4ca35c95eea359936f8d2f497f3f4fb8b4c6780a6c2c8a064f95ba684c444 WatchSource:0}: Error finding container 2bc4ca35c95eea359936f8d2f497f3f4fb8b4c6780a6c2c8a064f95ba684c444: Status 404 returned error can't find the container with id 2bc4ca35c95eea359936f8d2f497f3f4fb8b4c6780a6c2c8a064f95ba684c444 Dec 03 12:13:31 crc kubenswrapper[4666]: E1203 12:13:31.980100 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.198195 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.202634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.202698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.202711 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.202746 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.203339 4666 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 12:13:32 crc kubenswrapper[4666]: W1203 12:13:32.298568 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.298670 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.369858 4666 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:32 crc kubenswrapper[4666]: W1203 12:13:32.388918 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.389060 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.427133 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e6d8bb5a19d23bf59fd0cbce78fcaf93d6b129501bbb54a1fdf5c74c9a708198"} Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.428916 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2bc4ca35c95eea359936f8d2f497f3f4fb8b4c6780a6c2c8a064f95ba684c444"} Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.430381 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41ccb771ad77d1f471dfcd8537f264ef2c42216f525da439fb30eb1506ed94c2"} Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.431940 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b67a09bce16883e1b1f597a313fb76ae8bd01835c9f3313caf1a470d40d1d32"} Dec 03 12:13:32 crc kubenswrapper[4666]: I1203 12:13:32.433286 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51940bad4ddbcf9801a717631ded260de4fe4e9c752df22a49d0f417f4a3823d"} Dec 03 12:13:32 crc kubenswrapper[4666]: W1203 12:13:32.653630 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.653762 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:13:32 crc kubenswrapper[4666]: W1203 12:13:32.656637 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.657002 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 12:13:32 crc kubenswrapper[4666]: E1203 12:13:32.781938 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.004284 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.006666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.006708 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.006718 4666 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.006745 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:33 crc kubenswrapper[4666]: E1203 12:13:33.007391 4666 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.369809 4666 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.438333 4666 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf" exitCode=0 Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.438473 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.438520 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.439455 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f" exitCode=0 Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.439603 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440184 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440615 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440638 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440646 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440686 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.440736 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.442242 4666 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde" exitCode=0 Dec 03 12:13:33 crc 
kubenswrapper[4666]: I1203 12:13:33.442290 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.442379 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.443002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.443026 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.443036 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.444826 4666 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459" exitCode=0 Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.444870 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.444931 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.449161 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.450905 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.450938 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.450951 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.452772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.452791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.452804 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.453860 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.453933 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.453954 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.453969 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554"} Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.454046 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.454910 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.454939 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:33 crc kubenswrapper[4666]: I1203 12:13:33.454949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.027152 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.369520 4666 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 12:13:34 crc kubenswrapper[4666]: E1203 12:13:34.383903 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.460705 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.460813 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.465559 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.465613 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.465628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.467934 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:34 crc kubenswrapper[4666]: 
I1203 12:13:34.468345 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.468370 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.468381 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.468737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.468757 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.468767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.472959 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.473015 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.473030 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.475032 4666 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e" exitCode=0 Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.475103 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e"} Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.475131 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.475238 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476236 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476248 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476261 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476267 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476275 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.476277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.607719 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.609171 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.609202 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.609212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.609243 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:34 crc kubenswrapper[4666]: I1203 12:13:34.722853 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.482512 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71"} Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.482571 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7"} Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.482720 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.484216 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.484256 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.484268 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.486437 4666 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9" exitCode=0 Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.486577 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 
12:13:35.486482 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9"} Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.486684 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.486779 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.486806 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488465 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488501 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488519 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488540 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488544 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488565 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488580 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488670 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488551 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:35 crc kubenswrapper[4666]: I1203 12:13:35.488781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494214 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"655e3e64dcbdd148aced91acaa10b627bd23eda51dced477f3ee7b2cc74cc8b7"} Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494283 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"52359e94945637cfe5a1b40c96bb5adfe7d99b4f2cdcc22c164f1b51b7299e51"} Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494303 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bed75ae0c029eb4a1b2c56a1efed6a8867eba9df99a868001e5beef026c6874a"} Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494315 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7554583c33429d01b442e078c6d75f9234971717fc5f99ac1d238d41ab3e6cbd"} Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494325 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494435 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.494476 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.495448 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.495482 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.495495 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.496333 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.496353 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:36 crc kubenswrapper[4666]: I1203 12:13:36.496364 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.027192 4666 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.027337 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.100831 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.501849 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a333daff0ee07b57450db848265866622a69262e0c671b52481b5d866da63d13"} Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.501951 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.502011 4666 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.503320 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.503353 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.503364 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.504535 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.504569 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.504579 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.628191 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:37 crc kubenswrapper[4666]: I1203 12:13:37.840995 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.504905 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.504908 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506082 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506112 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506691 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506771 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:38 crc kubenswrapper[4666]: I1203 12:13:38.506792 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.507217 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.511301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.511448 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.511489 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:39 crc 
kubenswrapper[4666]: I1203 12:13:39.522884 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.523215 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.525158 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.525241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:39 crc kubenswrapper[4666]: I1203 12:13:39.525269 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.213963 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.214263 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.215880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.216179 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.216380 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.249593 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.509937 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.511652 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.511714 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:40 crc kubenswrapper[4666]: I1203 12:13:40.511737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:41 crc kubenswrapper[4666]: E1203 12:13:41.499347 4666 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.337691 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.337954 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.339923 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.339989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 
12:13:43.340009 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.343232 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.518196 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.519549 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.519603 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.519637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:43 crc kubenswrapper[4666]: I1203 12:13:43.522559 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:44 crc kubenswrapper[4666]: E1203 12:13:44.412834 4666 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187db384aa014f05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:13:31.368369925 +0000 UTC m=+0.213330976,LastTimestamp:2025-12-03 12:13:31.368369925 +0000 UTC m=+0.213330976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.520916 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.521875 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.521917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.521927 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.601965 4666 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.602051 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 12:13:44 crc kubenswrapper[4666]: E1203 12:13:44.610396 
4666 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 03 12:13:44 crc kubenswrapper[4666]: W1203 12:13:44.753734 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:13:44 crc kubenswrapper[4666]: I1203 12:13:44.753848 4666 trace.go:236] Trace[1951393622]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:13:34.752) (total time: 10000ms): Dec 03 12:13:44 crc kubenswrapper[4666]: Trace[1951393622]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (12:13:44.753) Dec 03 12:13:44 crc kubenswrapper[4666]: Trace[1951393622]: [10.000955208s] [10.000955208s] END Dec 03 12:13:44 crc kubenswrapper[4666]: E1203 12:13:44.753881 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 12:13:45 crc kubenswrapper[4666]: W1203 12:13:45.294534 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.294674 4666 trace.go:236] Trace[1389389266]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:13:35.293) (total time: 10001ms): Dec 03 12:13:45 crc kubenswrapper[4666]: Trace[1389389266]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:13:45.294) Dec 03 12:13:45 crc kubenswrapper[4666]: Trace[1389389266]: [10.001224747s] [10.001224747s] END Dec 03 12:13:45 crc kubenswrapper[4666]: E1203 12:13:45.294705 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 12:13:45 crc kubenswrapper[4666]: W1203 12:13:45.342573 4666 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.342701 4666 trace.go:236] Trace[852623559]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:13:35.340) (total time: 10002ms): Dec 03 12:13:45 crc kubenswrapper[4666]: Trace[852623559]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (12:13:45.342) Dec 03 12:13:45 crc kubenswrapper[4666]: Trace[852623559]: 
[10.002140341s] [10.002140341s] END Dec 03 12:13:45 crc kubenswrapper[4666]: E1203 12:13:45.342743 4666 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.370619 4666 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.465480 4666 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.465559 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.469700 4666 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 12:13:45 crc kubenswrapper[4666]: I1203 12:13:45.469785 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.743428 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.743663 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.744968 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.745015 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.745034 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:46 crc kubenswrapper[4666]: I1203 12:13:46.787146 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.028111 4666 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.028206 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.531965 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.533299 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.533341 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.533354 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.546369 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.633303 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.633573 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.635121 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.635164 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.635176 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.641865 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.811161 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.812769 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.812832 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.812843 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:47 crc kubenswrapper[4666]: I1203 12:13:47.812885 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:47 crc kubenswrapper[4666]: E1203 12:13:47.816872 4666 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config 
cache not synchronized" node="crc" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.535119 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.535136 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.535161 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536147 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536256 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536245 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:48 crc kubenswrapper[4666]: I1203 12:13:48.536346 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:49 crc kubenswrapper[4666]: I1203 12:13:49.127271 4666 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 12:13:49 crc kubenswrapper[4666]: I1203 12:13:49.161506 4666 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 12:13:50 crc kubenswrapper[4666]: E1203 12:13:50.462349 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.463748 4666 trace.go:236] Trace[1642980719]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 12:13:35.798) (total time: 14665ms): Dec 03 12:13:50 crc kubenswrapper[4666]: Trace[1642980719]: ---"Objects listed" error: 14665ms (12:13:50.463) Dec 03 12:13:50 crc kubenswrapper[4666]: Trace[1642980719]: [14.66548889s] [14.66548889s] END Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.463782 4666 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.464877 4666 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.500729 4666 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37070->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.500818 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37070->192.168.126.11:17697: read: connection reset by peer" Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.501344 4666 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.501465 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 12:13:50 crc kubenswrapper[4666]: I1203 12:13:50.726870 4666 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.374517 4666 apiserver.go:52] "Watching apiserver" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.377651 4666 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378103 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-dns/node-resolver-vzctp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-multus/multus-wbdks","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-q9g72","openshift-multus/multus-additional-cni-plugins-p6hxn"] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378563 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378591 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378666 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378673 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.378833 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.378903 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.379047 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.379169 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.379330 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.379479 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.379589 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.379650 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.379688 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.381747 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.381927 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.382832 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.383077 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.383762 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.383772 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.384634 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.384960 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.385384 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.385587 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.385861 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.385879 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390456 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390518 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390688 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390772 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390798 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390817 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390843 4666 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390851 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.390975 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.391368 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.391639 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.393954 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.411227 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.432554 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.448076 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.459962 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.473563 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.474979 4666 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475415 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475494 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475516 4666 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475538 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475557 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475582 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475606 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475628 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475648 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475669 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475689 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475708 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 
12:13:51.475725 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475748 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475769 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475786 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475806 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475864 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475886 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475903 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475947 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475963 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" 
(UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475982 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475999 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476016 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.475997 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476020 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476034 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476165 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476200 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476230 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476253 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476255 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476263 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476272 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476300 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476322 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476348 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476370 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476394 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476447 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476474 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476482 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476570 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476588 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476620 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476772 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476499 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476876 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476896 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476895 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476913 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476954 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476973 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476992 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477009 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477028 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477047 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477067 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477101 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477125 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" 
(UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477146 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477202 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477222 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477241 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477260 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477282 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477302 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477325 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477343 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477360 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 
03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477376 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477396 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477415 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477434 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477451 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477470 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477488 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477507 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477525 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477577 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477594 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477615 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477634 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477654 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477674 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477691 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477710 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477728 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477744 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477763 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477784 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477801 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477818 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477835 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477855 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477875 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477895 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477914 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477937 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477954 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477972 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477991 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478042 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478058 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478080 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478111 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478129 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478148 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478165 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478183 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478202 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478218 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478235 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478252 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478268 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478285 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478304 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478321 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478337 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: 
I1203 12:13:51.478356 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478373 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478391 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478406 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478435 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478452 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478469 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478484 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478503 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478521 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 
crc kubenswrapper[4666]: I1203 12:13:51.478538 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478555 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478574 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478590 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478610 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478627 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478644 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478662 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478680 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478695 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478712 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478728 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478745 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478761 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478779 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478796 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478816 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478832 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478853 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478873 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478890 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478907 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478925 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478943 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478961 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478981 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479000 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479019 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479036 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479054 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" 
(UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479074 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479108 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479130 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479165 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479185 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479202 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479219 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479314 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479336 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479355 4666 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479373 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479391 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479408 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479426 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479444 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479463 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479484 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479520 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479538 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479557 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479575 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479595 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479613 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479630 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479647 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482291 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476907 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.476999 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477167 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477207 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482394 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482475 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482523 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482647 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482706 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482746 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482786 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482845 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482884 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482928 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482968 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483002 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483045 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483111 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483155 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483187 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483223 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.490761 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.490852 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.490896 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.490931 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.490974 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.491021 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.491058 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99024354-6b69-4788-9f26-2f7fbef66e7f-hosts-file\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492286 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-rootfs\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492344 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-conf-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492380 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-multus-daemon-config\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492416 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492449 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492489 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492528 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492555 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492589 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-k8s-cni-cncf-io\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 
12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492622 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-multus-certs\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492655 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/ba134276-4c96-4ba6-b18f-276b312a7355-kube-api-access-b67km\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492689 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-socket-dir-parent\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492723 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-bin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492774 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-etc-kubernetes\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492818 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492850 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cnibin\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492880 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6kq\" (UniqueName: \"kubernetes.io/projected/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-kube-api-access-vd6kq\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492920 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492954 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-netns\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.492992 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493047 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgq5\" (UniqueName: \"kubernetes.io/projected/99024354-6b69-4788-9f26-2f7fbef66e7f-kube-api-access-ctgq5\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493082 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493135 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-kubelet\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493171 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-hostroot\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493208 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493240 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493276 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493311 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-proxy-tls\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493345 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-multus\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.498284 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.505443 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.505917 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.506474 4666 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511339 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511665 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511712 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-os-release\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511754 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj868\" (UniqueName: \"kubernetes.io/projected/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-kube-api-access-gj868\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511788 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-os-release\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511816 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-system-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511842 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-cnibin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511870 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-cni-binary-copy\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512041 4666 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512069 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512206 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512224 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512240 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512254 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512273 4666 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512287 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512303 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512318 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512338 4666 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512353 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.542026 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: 
I1203 12:13:51.548330 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477227 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477233 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477357 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477426 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477462 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477554 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477582 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477738 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477830 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477854 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.477978 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478058 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478113 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478205 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478269 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478371 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478397 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478454 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478516 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478659 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478683 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478848 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478892 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.478919 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479028 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479115 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479302 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479396 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479502 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.479669 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.480539 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.480876 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.481127 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.481190 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.481237 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.481484 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.481789 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482276 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482509 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482664 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482663 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482799 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.482871 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.483053 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.484956 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.485260 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.493105 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.494247 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.494759 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.494954 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.495157 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.495350 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.497129 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.554370 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.554492 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.554677 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:52.054007153 +0000 UTC m=+20.898968204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.555186 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.555610 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.555677 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.555718 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.555974 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.556267 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.556738 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.557188 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.557816 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.558498 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.558866 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.559570 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.559433 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.561296 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.561351 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.561385 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.509912 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.510503 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.510767 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511009 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511173 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511602 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.511932 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512200 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.512536 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.513150 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.513537 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.513890 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.514684 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.515079 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.515354 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.515574 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.515796 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.528650 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.531232 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.495700 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.532306 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.532362 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.532794 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.533033 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.533225 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.533479 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.533880 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.534210 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.534261 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.536307 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.538575 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.538850 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.539045 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.539112 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.539643 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.540364 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.540511 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.540566 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.547721 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.547957 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.551530 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.551628 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.552641 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.561745 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.562136 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.562559 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.562775 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.563336 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.563376 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:13:52.062259441 +0000 UTC m=+20.907220492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.497674 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.563768 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.563869 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.563935 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564009 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564361 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564471 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564369 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564635 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564705 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564757 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564851 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.564978 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.565288 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.565330 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.565866 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.565905 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.565927 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.566436 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.566719 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.566754 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:52.063901974 +0000 UTC m=+20.908863025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.566841 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567316 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567421 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567463 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567479 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567870 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.567897 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.568158 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.568330 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.576451 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.576883 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.576960 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.577270 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.577324 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.577882 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.578563 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.579124 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.579284 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:52.07926059 +0000 UTC m=+20.924221641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.579377 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.579536 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.579802 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.579933 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.580122 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.580350 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.580544 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.580619 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.580837 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.581145 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.582671 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.583156 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.584060 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.584650 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.584976 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.585113 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.585136 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.585152 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.585343 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: E1203 12:13:51.585380 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:52.085347331 +0000 UTC m=+20.930308372 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.585801 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71" exitCode=255 Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.585848 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71"} Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.586316 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.586471 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.586762 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.588307 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.588555 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.588563 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.589266 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.589764 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.590181 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.596998 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.597383 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.599241 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.599379 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.599415 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.599785 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":
\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.601013 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.601533 4666 scope.go:117] "RemoveContainer" containerID="3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.601871 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.611039 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614257 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99024354-6b69-4788-9f26-2f7fbef66e7f-hosts-file\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614450 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-rootfs\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614721 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-conf-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614801 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-multus-daemon-config\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614872 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.615007 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-conf-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.615018 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.615113 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.615145 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-k8s-cni-cncf-io\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 
12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.615816 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614661 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-rootfs\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616518 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-multus-daemon-config\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616713 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-multus-certs\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616749 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/ba134276-4c96-4ba6-b18f-276b312a7355-kube-api-access-b67km\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616769 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-etc-kubernetes\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616789 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-socket-dir-parent\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616807 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-bin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616829 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6kq\" (UniqueName: \"kubernetes.io/projected/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-kube-api-access-vd6kq\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616874 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-netns\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616890 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616909 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cnibin\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616928 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgq5\" (UniqueName: \"kubernetes.io/projected/99024354-6b69-4788-9f26-2f7fbef66e7f-kube-api-access-ctgq5\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616945 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616961 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-kubelet\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.616976 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-hostroot\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617006 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-proxy-tls\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617027 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-multus\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617060 4666 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-os-release\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617088 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-os-release\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617130 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-system-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617159 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-cnibin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617180 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-cni-binary-copy\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617205 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj868\" (UniqueName: \"kubernetes.io/projected/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-kube-api-access-gj868\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617233 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617257 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617278 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617384 4666 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617402 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617416 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617431 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617446 4666 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617460 4666 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617482 4666 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617498 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617517 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617531 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617545 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617559 4666 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617570 4666 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617583 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 
12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617596 4666 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617607 4666 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617619 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617659 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617672 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617685 4666 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617697 4666 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.614629 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99024354-6b69-4788-9f26-2f7fbef66e7f-hosts-file\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617710 4666 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617723 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617736 4666 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617748 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617759 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617774 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617785 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617796 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617809 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617819 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617832 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617867 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617878 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617889 4666 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617901 4666 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617911 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617928 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617941 4666 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617952 4666 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617974 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617984 4666 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.617994 4666 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618003 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618014 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618023 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618032 4666 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618042 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618052 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618057 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618062 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618113 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618124 4666 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618135 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618140 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618131 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618521 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-os-release\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618145 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618675 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618692 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-multus-certs\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618705 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618717 4666 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node 
\"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618727 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618741 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618753 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618767 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618781 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618793 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618805 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618816 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618820 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-etc-kubernetes\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618828 4666 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618838 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-kubelet\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618850 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618882 
4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-multus-socket-dir-parent\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618885 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618904 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618906 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-hostroot\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618916 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618924 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-system-cni-dir\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618926 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618963 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618969 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-bin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618977 4666 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618996 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619014 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619047 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619065 4666 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619078 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619102 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619088 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-cnibin\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619079 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-var-lib-cni-multus\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619112 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619137 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619151 4666 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619162 4666 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619049 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619172 4666 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.618653 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-k8s-cni-cncf-io\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619217 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-os-release\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619290 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619261 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cnibin\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619316 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-system-cni-dir\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619337 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619298 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba134276-4c96-4ba6-b18f-276b312a7355-host-run-netns\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619355 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619394 4666 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619414 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619427 4666 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619439 4666 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619449 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619458 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619468 4666 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619479 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619488 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619502 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619513 4666 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619522 4666 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619532 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619540 4666 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619552 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619561 4666 
reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619571 4666 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619580 4666 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619589 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619599 4666 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619609 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619621 4666 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619629 4666 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619638 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619648 4666 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619657 4666 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619669 4666 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619679 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619689 4666 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619697 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619707 4666 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619718 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619728 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619736 4666 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619745 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619754 4666 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619763 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619772 4666 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619782 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619792 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619800 4666 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619809 4666 
reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619811 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba134276-4c96-4ba6-b18f-276b312a7355-cni-binary-copy\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619818 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619855 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619866 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619877 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619887 4666 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619912 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619927 4666 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619937 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619947 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619958 4666 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619968 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619976 4666 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619986 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619910 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.619996 4666 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620042 4666 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620053 4666 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620066 4666 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620078 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620102 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620114 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620124 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620140 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620150 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc 
kubenswrapper[4666]: I1203 12:13:51.620160 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620170 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620181 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620193 4666 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620204 4666 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620213 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620224 4666 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620233 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620243 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620269 4666 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620279 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620289 4666 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620298 4666 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620307 4666 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620318 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620327 4666 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620337 4666 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620347 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620356 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620367 4666 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620376 4666 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620386 4666 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620396 4666 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620405 4666 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620415 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620426 4666 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620435 4666 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620446 4666 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620458 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.620468 4666 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.623503 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.623649 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.623754 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-proxy-tls\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.633310 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgq5\" (UniqueName: \"kubernetes.io/projected/99024354-6b69-4788-9f26-2f7fbef66e7f-kube-api-access-ctgq5\") pod \"node-resolver-vzctp\" (UID: \"99024354-6b69-4788-9f26-2f7fbef66e7f\") " pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.633797 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/ba134276-4c96-4ba6-b18f-276b312a7355-kube-api-access-b67km\") pod \"multus-wbdks\" (UID: \"ba134276-4c96-4ba6-b18f-276b312a7355\") " pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.636614 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6kq\" (UniqueName: \"kubernetes.io/projected/bea0ec2c-aed9-4ff3-9f36-48d3106926b5-kube-api-access-vd6kq\") pod \"multus-additional-cni-plugins-p6hxn\" (UID: \"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\") " pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.636755 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj868\" (UniqueName: \"kubernetes.io/projected/782e76d3-8dbe-4c2e-952c-6a966e2c06a2-kube-api-access-gj868\") pod \"machine-config-daemon-q9g72\" (UID: \"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\") " pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.639240 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.644621 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mh5x5"] Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.645533 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650148 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650150 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650223 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650419 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650154 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650635 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650682 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.650759 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.659699 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.672123 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.685225 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.696165 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.699246 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.702712 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.710801 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.711245 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.718874 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vzctp" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721363 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721412 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721463 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721488 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721513 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721539 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721572 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721594 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721633 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721657 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721684 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721718 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721741 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721771 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whx68\" (UniqueName: \"kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721795 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721822 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721867 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721890 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721951 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.721979 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.722229 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.723277 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.726344 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wbdks" Dec 03 12:13:51 crc kubenswrapper[4666]: W1203 12:13:51.731205 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-631678d8df7680d4891519cac99e263f7763495c6a8194611e83ebf09103f54b WatchSource:0}: Error finding container 631678d8df7680d4891519cac99e263f7763495c6a8194611e83ebf09103f54b: Status 404 returned error can't find the container with id 631678d8df7680d4891519cac99e263f7763495c6a8194611e83ebf09103f54b Dec 03 12:13:51 crc kubenswrapper[4666]: W1203 12:13:51.735160 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-f2bd47739d58d9b8242c1538f9a083493a2db1ea4308d4c0e58bcd994649b4ca WatchSource:0}: Error finding container f2bd47739d58d9b8242c1538f9a083493a2db1ea4308d4c0e58bcd994649b4ca: Status 404 returned error can't find the container with id f2bd47739d58d9b8242c1538f9a083493a2db1ea4308d4c0e58bcd994649b4ca Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.736796 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.738257 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.745794 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.753094 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: W1203 12:13:51.759396 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba134276_4c96_4ba6_b18f_276b312a7355.slice/crio-907740b86e42fa967071b9f3f52ffe008d204797bf40ee71969f1892fec9a40c WatchSource:0}: Error finding container 907740b86e42fa967071b9f3f52ffe008d204797bf40ee71969f1892fec9a40c: Status 404 returned error can't find the container with id 907740b86e42fa967071b9f3f52ffe008d204797bf40ee71969f1892fec9a40c Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.771105 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.784055 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.798588 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03
T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: W1203 12:13:51.806057 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea0ec2c_aed9_4ff3_9f36_48d3106926b5.slice/crio-083bc7d9caa6378db9333ab25f19dc9a58ccd0897eeb1aa4d2991533e07a2942 WatchSource:0}: Error finding container 083bc7d9caa6378db9333ab25f19dc9a58ccd0897eeb1aa4d2991533e07a2942: Status 404 returned error can't find the container with id 083bc7d9caa6378db9333ab25f19dc9a58ccd0897eeb1aa4d2991533e07a2942 Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.816708 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823000 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823041 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823140 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823162 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn\") pod \"ovnkube-node-mh5x5\" (UID: 
\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823182 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823198 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whx68\" (UniqueName: \"kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823214 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823230 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823247 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823265 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823296 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823312 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823329 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823347 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823371 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823390 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823405 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823421 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823448 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823463 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823535 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823647 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: 
I1203 12:13:51.823647 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.823775 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824154 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824195 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824207 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824225 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824254 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824258 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824282 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824323 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824297 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824358 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824362 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.824436 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.825698 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.826821 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.827451 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.835988 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.841343 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.845576 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whx68\" (UniqueName: \"kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68\") pod \"ovnkube-node-mh5x5\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.854755 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.866032 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.878352 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.892755 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.905257 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.919399 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.932613 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.951896 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.964210 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.966448 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39ea
f9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.988864 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:51 crc kubenswrapper[4666]: I1203 12:13:51.997808 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.009797 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.127469 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.127633 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.127670 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.127699 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.127720 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.127813 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.127878 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:53.127859634 +0000 UTC m=+21.972820685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.127937 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:13:53.127928406 +0000 UTC m=+21.972889457 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128032 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128049 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128065 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128114 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:53.128104641 +0000 UTC m=+21.973065702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128174 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128185 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128195 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128220 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:53.128212424 +0000 UTC m=+21.973173475 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128275 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.128299 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:53.128291406 +0000 UTC m=+21.973252447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.423309 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.423479 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.423563 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:52 crc kubenswrapper[4666]: E1203 12:13:52.423615 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.592098 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.592160 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.592177 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"631678d8df7680d4891519cac99e263f7763495c6a8194611e83ebf09103f54b"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.593815 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2ee5e910c3309790952d25e556f4eebd4cc74f0c0d0bebde952f07d149e3eb0a"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.594990 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerStarted","Data":"7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.595015 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerStarted","Data":"907740b86e42fa967071b9f3f52ffe008d204797bf40ee71969f1892fec9a40c"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.597006 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.598500 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.598755 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.600495 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.600534 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.600551 4666 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"420f0ed0c4fdc11298c6f06cd1fa647bc5eda5bf740049a39e731e4a9484fbc1"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.601931 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vzctp" event={"ID":"99024354-6b69-4788-9f26-2f7fbef66e7f","Type":"ContainerStarted","Data":"7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.601958 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vzctp" event={"ID":"99024354-6b69-4788-9f26-2f7fbef66e7f","Type":"ContainerStarted","Data":"8d43bce91ac7df97583cc9def7eb491f1c453211cd292da32d50942523ff2244"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.605593 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.605670 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f2bd47739d58d9b8242c1538f9a083493a2db1ea4308d4c0e58bcd994649b4ca"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.606983 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerStarted","Data":"965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.607032 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerStarted","Data":"083bc7d9caa6378db9333ab25f19dc9a58ccd0897eeb1aa4d2991533e07a2942"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.608399 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3" exitCode=0 Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.608478 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.608516 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"1fb6d1da6897c031ffd99693a1ec02f1feb37712a63fa0cdf6e4e412874621fd"} Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.622460 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.640848 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.655542 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.675070 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.696505 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.718174 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.737758 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.765740 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.785027 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.802038 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.824035 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03
T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.840828 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.862996 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.880578 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.893489 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.910009 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.923462 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.948100 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.964822 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:52 crc kubenswrapper[4666]: I1203 12:13:52.982605 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.002506 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.022185 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.040935 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.064001 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.136139 4666 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.136258 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136378 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:13:55.136321168 +0000 UTC m=+23.981282219 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136498 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.136548 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.136640 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136569 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.136696 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136708 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136759 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136783 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:55.13677078 +0000 UTC m=+23.981732011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136674 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136827 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:55.136810011 +0000 UTC m=+23.981771062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136859 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:55.136850492 +0000 UTC m=+23.981811763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136910 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136935 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.136952 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.137036 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:55.137012917 +0000 UTC m=+23.981974128 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.422680 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:53 crc kubenswrapper[4666]: E1203 12:13:53.422871 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.426966 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.427803 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.428534 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.429200 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.429809 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.430374 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.430984 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.432756 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.434624 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.435698 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.436817 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.439040 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.445946 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.446756 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.447526 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.448899 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.449761 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.451056 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.451869 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.452889 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.454804 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.456275 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.459153 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.460150 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.460925 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.462698 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.463731 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.467492 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.469251 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.472542 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.473252 4666 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.473399 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.477863 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.478837 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.479509 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.482570 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.484016 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.484880 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.486529 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.487606 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.489620 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.490252 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.491038 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.491819 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.492498 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.494633 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.495300 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.496960 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.498268 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.500308 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.501323 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.502492 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.504677 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.505323 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.624995 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47" exitCode=0 Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.625112 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" 
event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47"} Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.629321 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3"} Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.629377 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a"} Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.629398 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60"} Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.649138 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.666055 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.677671 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.696290 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.712506 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.738957 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.758237 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.788141 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.804580 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.824083 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.840848 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.857017 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.976157 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4lgm9"] Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.976683 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.979764 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.980281 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.980646 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.982408 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 12:13:53 crc kubenswrapper[4666]: I1203 12:13:53.995767 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.019708 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.032858 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.035813 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.037791 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.045684 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49c29a00-1d2c-4222-9f43-e125c87085c5-serviceca\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.045779 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49c29a00-1d2c-4222-9f43-e125c87085c5-host\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.045905 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7g7\" (UniqueName: \"kubernetes.io/projected/49c29a00-1d2c-4222-9f43-e125c87085c5-kube-api-access-gj7g7\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.046419 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.050641 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.068521 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.083498 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.105272 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.132510 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.147158 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7g7\" (UniqueName: \"kubernetes.io/projected/49c29a00-1d2c-4222-9f43-e125c87085c5-kube-api-access-gj7g7\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.147256 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49c29a00-1d2c-4222-9f43-e125c87085c5-serviceca\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.147324 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49c29a00-1d2c-4222-9f43-e125c87085c5-host\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.147444 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49c29a00-1d2c-4222-9f43-e125c87085c5-host\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.148459 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49c29a00-1d2c-4222-9f43-e125c87085c5-serviceca\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 
12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.149997 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.169600 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7g7\" (UniqueName: \"kubernetes.io/projected/49c29a00-1d2c-4222-9f43-e125c87085c5-kube-api-access-gj7g7\") pod \"node-ca-4lgm9\" (UID: \"49c29a00-1d2c-4222-9f43-e125c87085c5\") " pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.174112 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.187749 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.200060 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.213363 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.217328 4666 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.219756 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.219815 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.219834 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.219999 4666 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.228800 4666 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.229078 4666 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.230016 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.230044 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.230056 4666 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.230075 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.230113 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.231901 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913
e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.249521 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.253252 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.256749 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.256785 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.256794 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.256810 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.256821 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.264640 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc 
kubenswrapper[4666]: E1203 12:13:54.268281 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.271470 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.271500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 
crc kubenswrapper[4666]: I1203 12:13:54.271511 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.271526 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.271538 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.276644 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.283821 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292586 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292622 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
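Annotation: interleaved with the webhook rejections, the kubelet keeps setting Ready=False with reason KubeletNotReady because /etc/kubernetes/cni/net.d/ contains no CNI configuration yet; the network plugin (OVN-Kubernetes via multus in these records) only drops that file once its pods are running, and ovnkube-node is still PodInitializing further down. A rough standalone sketch of the kind of directory check behind "no CNI configuration file", assuming libcni's usual .conf/.conflist/.json extensions; this is an illustration, not the kubelet's actual code, which also validates the file contents:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one candidate
    // CNI config file, by extension only.
    func hasCNIConfig(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println("NetworkReady:", ok)
    }
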
event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292635 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292664 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.292782 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.294166 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4lgm9" Dec 03 12:13:54 crc kubenswrapper[4666]: W1203 12:13:54.305815 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49c29a00_1d2c_4222_9f43_e125c87085c5.slice/crio-5e6a76d38fda6421184838a028b153104418e25825140e57bff10bb3f0f6ed4a WatchSource:0}: Error finding container 5e6a76d38fda6421184838a028b153104418e25825140e57bff10bb3f0f6ed4a: Status 404 returned error can't find the container with id 5e6a76d38fda6421184838a028b153104418e25825140e57bff10bb3f0f6ed4a Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.312633 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.317668 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.317710 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
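Annotation: the "Failed to update status for pod" records on either side of this point embed the exact strategic-merge patch the kubelet tried to send; the $setElementOrder/conditions directive preserves condition ordering, and the runs of backslashes appear because the patch is quoted inside err="...", which is itself quoted in the log line. Once unescaped it is ordinary JSON; a small sketch decoding the conditions from one such patch (the literal below is abbreviated from the network-check-source record that follows, and the struct covers only the fields printed here):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // statusPatch models just the slice of the patch we inspect;
    // unknown keys such as "$setElementOrder/conditions" are ignored
    // by encoding/json.
    type statusPatch struct {
        Status struct {
            Conditions []struct {
                Type               string `json:"type"`
                Status             string `json:"status"`
                Reason             string `json:"reason"`
                Message            string `json:"message"`
                LastTransitionTime string `json:"lastTransitionTime"`
            } `json:"conditions"`
        } `json:"status"`
    }

    func main() {
        raw := `{"status":{"$setElementOrder/conditions":[{"type":"PodReadyToStartContainers"},{"type":"Ready"}],"conditions":[{"lastTransitionTime":"2025-12-03T12:13:51Z","status":"False","type":"PodReadyToStartContainers"},{"message":"containers with unready status: [check-endpoints]","reason":"ContainersNotReady","type":"Ready"}]}}`
        var p statusPatch
        if err := json.Unmarshal([]byte(raw), &p); err != nil {
            panic(err)
        }
        for _, c := range p.Status.Conditions {
            fmt.Printf("%-28s status=%-5s reason=%s\n", c.Type, c.Status, c.Reason)
        }
    }
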
event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.317730 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.317750 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.317763 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.318712 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.333300 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.341570 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.341739 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.344529 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
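Annotation: this is where the kubelet gives up on the current sync: "Unable to update node status" err="update node status exceeds retry count", after the string of "Error updating node status, will retry" failures above. In the upstream kubelet that loop is bounded by the nodeStatusUpdateRetry constant (five attempts per sync in current sources); a minimal sketch of that give-up pattern, with the patch call stubbed to fail the way the webhook rejections here do:

    package main

    import (
        "errors"
        "fmt"
    )

    // nodeStatusUpdateRetry mirrors the kubelet's small fixed retry
    // budget for node status patches (named after the upstream constant).
    const nodeStatusUpdateRetry = 5

    // tryUpdateNodeStatus stands in for the real PATCH against the API
    // server; here it always fails, like the webhook rejections above.
    func tryUpdateNodeStatus() error {
        return errors.New("failed calling webhook: certificate has expired")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        return fmt.Errorf("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println("Unable to update node status:", err)
        }
    }
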
event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.344566 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.344588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.344683 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.344698 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.353442 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.376407 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.395193 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.419074 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.422540 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.422696 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.422732 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.423008 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.434895 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.445646 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.448672 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.448734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.448747 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.448812 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.448829 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.459067 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.551293 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.551344 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.551357 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.551374 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.551385 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.635815 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587" exitCode=0 Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.635876 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.642241 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.642306 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.642320 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.645193 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4lgm9" event={"ID":"49c29a00-1d2c-4222-9f43-e125c87085c5","Type":"ContainerStarted","Data":"b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.645261 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4lgm9" 
event={"ID":"49c29a00-1d2c-4222-9f43-e125c87085c5","Type":"ContainerStarted","Data":"5e6a76d38fda6421184838a028b153104418e25825140e57bff10bb3f0f6ed4a"} Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.654512 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.654567 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.654577 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.654598 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.654615 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:54 crc kubenswrapper[4666]: E1203 12:13:54.655871 4666 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.664452 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.679983 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.703003 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.717909 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.734307 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.748949 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.757443 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.757489 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.757500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.757519 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.757531 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.764721 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.780024 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.792109 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.803225 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.819059 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860055 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860130 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860144 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860180 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.860541 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.904006 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.941997 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.962619 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.962666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.962675 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.962691 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.962700 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:54Z","lastTransitionTime":"2025-12-03T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:13:54 crc kubenswrapper[4666]: I1203 12:13:54.983814 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:54Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.025022 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065247 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065775 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065827 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065843 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065863 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.065877 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.101207 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.142179 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.154992 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155238 4666 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:13:59.155204353 +0000 UTC m=+28.000165404 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.155351 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.155381 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.155409 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.155438 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155569 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155611 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155620 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155639 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155643 4666 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155656 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155660 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155675 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:59.155649445 +0000 UTC m=+28.000610496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155582 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155713 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:59.155692886 +0000 UTC m=+28.000654127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155738 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:59.155729597 +0000 UTC m=+28.000690888 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.155761 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:13:59.155753768 +0000 UTC m=+28.000714819 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.169000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.169035 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.169047 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.169066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.169077 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.182561 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.222697 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.263012 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.272364 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.272415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.272429 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.272451 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.272462 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.307903 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce
91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.345350 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.376343 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.376432 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.376452 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.376473 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.376486 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.383680 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.420927 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.423157 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:55 crc kubenswrapper[4666]: E1203 12:13:55.423318 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.459534 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.478912 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.478954 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.478963 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.478979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.478988 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.500644 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9
15fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.581702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.581741 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.581750 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.581772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.581791 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.649890 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf" exitCode=0 Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.649967 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.651797 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.665849 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.679154 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.684152 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.684194 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.684204 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.684231 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.684242 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.692814 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.713380 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.735060 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.748385 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.783695 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.786987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.787028 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.787039 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.787054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.787066 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.833767 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.866038 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.896460 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.896514 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.896528 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.896549 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.896564 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.908060 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9
e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.941453 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.982818 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.999422 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.999461 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.999472 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.999496 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 
12:13:55 crc kubenswrapper[4666]: I1203 12:13:55.999514 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:55Z","lastTransitionTime":"2025-12-03T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.023917 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.061669 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.098467 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.108997 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.109046 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.109067 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.109117 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.109135 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.144937 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.183231 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.211881 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.211976 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.212024 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.212061 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.212115 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.222375 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.264342 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.301474 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.315312 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.315351 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.315362 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.315383 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.315397 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.350428 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce
91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.384851 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.417727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.417799 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.417818 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.417847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.417866 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.422919 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.422939 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:56 crc kubenswrapper[4666]: E1203 12:13:56.423039 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:13:56 crc kubenswrapper[4666]: E1203 12:13:56.423182 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.431186 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.464378 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.502502 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.520757 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.520794 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.520807 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.520828 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.520842 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.544944 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.585862 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.623711 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.623786 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.623810 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.623860 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.623899 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.626848 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.658163 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075" exitCode=0 Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.658282 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.669761 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.678526 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.708438 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.726808 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.726887 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.726906 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.726935 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.726954 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.743243 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.782228 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.823283 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.829515 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.829556 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.829568 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.829588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.829601 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.862338 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.902756 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.932386 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.932422 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.932432 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.932447 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.932456 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:56Z","lastTransitionTime":"2025-12-03T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.945252 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:
13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:56 crc kubenswrapper[4666]: I1203 12:13:56.980537 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.021938 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.035283 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.035315 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.035326 4666 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.035342 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.035352 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.063048 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.105692 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.139179 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.139254 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.139272 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.139339 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.139359 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.150819 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.189509 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.243064 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.243118 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.243131 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.243149 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.243160 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.347306 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.347396 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.347412 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.347434 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.347449 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.426070 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:57 crc kubenswrapper[4666]: E1203 12:13:57.426236 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.450421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.450478 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.450492 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.450513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.450528 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.554030 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.554170 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.554190 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.554220 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.554238 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.658778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.658865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.658889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.658986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.659018 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.677490 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerStarted","Data":"f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.697886 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.724560 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.738610 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.760650 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.762051 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.762119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.762140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.762172 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.762190 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.779428 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.798967 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.817433 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.832026 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.846702 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.860938 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.865673 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.865734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.865751 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.865773 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.865789 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.877977 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.896777 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.912334 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.929258 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.969029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.969111 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.969123 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.969148 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:57 crc kubenswrapper[4666]: I1203 12:13:57.969164 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:57Z","lastTransitionTime":"2025-12-03T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.071706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.071751 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.071763 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.071781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.071790 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.174781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.174874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.174902 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.174937 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.174962 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.281954 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.281993 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.282003 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.282021 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.282032 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.384750 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.384791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.384802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.384820 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.384830 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.423473 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.423537 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:58 crc kubenswrapper[4666]: E1203 12:13:58.423636 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:13:58 crc kubenswrapper[4666]: E1203 12:13:58.423735 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.487934 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.488319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.488328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.488350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.488363 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.591217 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.591263 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.591306 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.591328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.591340 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.689760 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d" exitCode=0 Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.689816 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.693505 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.693538 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.693550 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.693572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.693585 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.698803 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.699346 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.699400 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.706722 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.720309 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.737344 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81
c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.748332 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.753983 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.758000 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.774586 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.786038 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.798712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.798761 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.798772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.798790 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.798799 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.800666 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.815465 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.830008 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.853602 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.880992 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z 
is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.897776 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.900613 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.900653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.900664 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.900682 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.900692 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:58Z","lastTransitionTime":"2025-12-03T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.912901 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.924047 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.934140 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.946009 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.960896 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81
c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.976253 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:58 crc kubenswrapper[4666]: I1203 12:13:58.991989 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.003380 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 
crc kubenswrapper[4666]: I1203 12:13:59.003422 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.003432 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.003449 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.003459 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.006976 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\"
,\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.025834 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf63
1088a17719a4d73810ea9989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.038259 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.053968 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.068144 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.081044 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.098785 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.110317 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.110378 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.110393 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.110438 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.110456 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.122912 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.137140 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.199873 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200023 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:07.199992025 +0000 UTC m=+36.044953076 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.200065 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.200133 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200164 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.200169 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.200206 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200261 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:07.200249592 +0000 UTC m=+36.045210643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200350 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200385 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:07.200375165 +0000 UTC m=+36.045336216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200438 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200483 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200502 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200579 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:07.20055258 +0000 UTC m=+36.045513631 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200698 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200715 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200725 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.200757 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:07.200749366 +0000 UTC m=+36.045710417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.212562 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.212608 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.212618 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.212667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.212680 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.316228 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.316276 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.316287 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.316307 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.316319 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.419415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.419790 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.419982 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.420273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.420501 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.423051 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:13:59 crc kubenswrapper[4666]: E1203 12:13:59.423189 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.524588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.524679 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.524696 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.525022 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.525046 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.629241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.629283 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.629295 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.629315 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.629327 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.706205 4666 generic.go:334] "Generic (PLEG): container finished" podID="bea0ec2c-aed9-4ff3-9f36-48d3106926b5" containerID="dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520" exitCode=0 Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.706348 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.706706 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerDied","Data":"dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.731506 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf63
1088a17719a4d73810ea9989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.734218 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.734262 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.734273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.734291 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.734300 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.746832 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.761840 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.776273 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.792801 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.806017 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.820312 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.835826 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.837160 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.837221 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.837238 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.837264 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.837279 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.851171 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.864029 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.877616 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.892226 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81
c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.910239 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.926857 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269e
b06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.940584 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.940628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.940639 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.940656 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:13:59 crc kubenswrapper[4666]: I1203 12:13:59.940668 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:13:59Z","lastTransitionTime":"2025-12-03T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.042931 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.042987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.043001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.043020 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.043033 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.146600 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.146647 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.146664 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.146692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.146704 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.252027 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.252078 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.252101 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.252119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.252133 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.355146 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.355196 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.355206 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.355226 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.355237 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.423032 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.423124 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:00 crc kubenswrapper[4666]: E1203 12:14:00.423601 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:00 crc kubenswrapper[4666]: E1203 12:14:00.423704 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.458680 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.458720 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.458728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.458744 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.458753 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.561028 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.561082 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.561120 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.561145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.561163 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.664550 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.664595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.664607 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.664627 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.664640 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.713802 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" event={"ID":"bea0ec2c-aed9-4ff3-9f36-48d3106926b5","Type":"ContainerStarted","Data":"9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.713854 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.727544 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.742746 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.756470 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.767455 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.767493 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.767505 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.767523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.767535 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.770197 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.780351 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.805032 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81
c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.823747 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.843168 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.858555 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.870753 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.870792 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.870804 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.870823 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.870838 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.886280 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf63
1088a17719a4d73810ea9989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.899718 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.921164 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.939026 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.954131 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.974327 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.974377 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.974388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.974408 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:00 crc kubenswrapper[4666]: I1203 12:14:00.974420 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:00Z","lastTransitionTime":"2025-12-03T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.079205 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.079247 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.079259 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.079280 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.079293 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.180541 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.182695 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.182728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.182743 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.182760 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.182772 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.289669 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.289735 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.289748 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.289768 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.289780 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.393152 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.393427 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.393496 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.393579 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.393686 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.423023 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:01 crc kubenswrapper[4666]: E1203 12:14:01.423239 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.440139 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.451814 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.468012 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.480309 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.493566 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.496258 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.496381 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.496445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.496509 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.496581 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.509596 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.532835 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.572760 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.599286 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.599617 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.599694 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.599766 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.599825 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.604786 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.625288 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.659564 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f
422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.673480 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.686630 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.700827 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.702621 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.702654 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.702666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.702688 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.702703 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.805322 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.805386 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.805400 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.805423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.805435 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.909068 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.909127 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.909140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.909158 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:01 crc kubenswrapper[4666]: I1203 12:14:01.909172 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:01Z","lastTransitionTime":"2025-12-03T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.011966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.012007 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.012016 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.012034 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.012043 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.114440 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.114478 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.114487 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.114504 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.114515 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.217258 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.217308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.217325 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.217350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.217372 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.320452 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.320513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.320532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.320565 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.320586 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.422472 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.422516 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:02 crc kubenswrapper[4666]: E1203 12:14:02.422635 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:02 crc kubenswrapper[4666]: E1203 12:14:02.422752 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.423981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.424018 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.424031 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.424049 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.424064 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.527771 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.527817 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.527827 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.527844 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.527860 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.630352 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.630398 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.630410 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.630428 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.630440 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.721524 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/0.log" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.724569 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989" exitCode=1 Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.724618 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.725523 4666 scope.go:117] "RemoveContainer" containerID="a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.732345 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.732381 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.732395 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.732412 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.732426 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.742889 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.759651 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.770665 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.785741 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.799162 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.818621 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81
c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.834082 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.835423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.835457 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.835467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.835486 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.835499 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.850857 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:
13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.862780 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.875596 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.889891 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.902825 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.914837 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.933401 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf63
1088a17719a4d73810ea9989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.938472 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.938514 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.938528 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.938546 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:02 crc kubenswrapper[4666]: I1203 12:14:02.938561 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:02Z","lastTransitionTime":"2025-12-03T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.041310 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.041403 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.041423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.041450 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.041469 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.143512 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.143571 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.143580 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.143596 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.143606 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.247454 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.247518 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.247533 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.247557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.247573 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.350666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.350715 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.350727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.350746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.350758 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.423637 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:03 crc kubenswrapper[4666]: E1203 12:14:03.423821 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.454033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.454107 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.454123 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.454144 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.454156 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.556786 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.556821 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.556829 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.556847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.556857 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.659159 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.659209 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.659221 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.659241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.659254 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.730175 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/0.log" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.732298 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08"} Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.732852 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.745606 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.757502 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 
12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.761536 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.761593 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.761608 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.761636 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.761654 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.772641 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 
12:14:03.788316 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.803803 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.820945 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.836542 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.849950 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.864419 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.864477 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.864492 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.864517 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.864530 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.867560 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.895254 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.909665 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.931056 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.948840 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.962734 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.968162 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.968223 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.968236 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.968256 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:03 crc kubenswrapper[4666]: I1203 12:14:03.968555 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:03Z","lastTransitionTime":"2025-12-03T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.070935 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.071235 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.071317 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.071379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.071433 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.106181 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j"] Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.107181 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.110950 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.112072 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.128695 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.148525 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f
422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.163854 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.174261 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.174348 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.174369 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.174399 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.174419 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.180849 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.199013 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.209138 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6kt\" (UniqueName: \"kubernetes.io/projected/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-kube-api-access-bc6kt\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.209311 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.209485 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc 
kubenswrapper[4666]: I1203 12:14:04.209613 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.217879 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.230362 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.245292 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.260424 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.272528 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.277821 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.277870 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.277880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.277899 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.277911 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.287429 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.298163 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.310560 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.311580 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.311611 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.311666 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6kt\" (UniqueName: \"kubernetes.io/projected/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-kube-api-access-bc6kt\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.312282 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.312372 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.312715 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.317055 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.330828 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6kt\" (UniqueName: \"kubernetes.io/projected/ccad3bb3-5e47-4dcc-a6e0-830378bde2ce-kube-api-access-bc6kt\") pod \"ovnkube-control-plane-749d76644c-cp56j\" (UID: \"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.333062 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.350307 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.379961 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.380000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc 
kubenswrapper[4666]: I1203 12:14:04.380009 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.380025 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.380036 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.420060 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.423197 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.423214 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.423368 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.423440 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.483257 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.483301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.483311 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.483331 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.483341 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.586696 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.586744 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.586756 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.586778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.586791 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.688997 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.689042 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.689053 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.689072 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.689109 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.706401 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.706444 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.706456 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.706477 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.706488 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.719987 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.724636 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.724681 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.724690 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.724708 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.724723 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.736872 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.740733 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" event={"ID":"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce","Type":"ContainerStarted","Data":"b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.740807 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" event={"ID":"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce","Type":"ContainerStarted","Data":"81d9bc99f89a066f5544576aaecf03cf529924603e705b4d44a534fab8c0c07f"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.741831 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.741870 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.741887 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.741908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.741926 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.754707 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 
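
The other condition repeating through this window is the Ready=False message itself: the node stays NotReady because /etc/kubernetes/cni/net.d/ holds no CNI configuration until the network provider (ovn-kubernetes here) writes one. A companion sketch in the same spirit, assuming only the directory path quoted in the kubelet message:

// checkcniconf.go: list the CNI configuration directory the kubelet is
// complaining about. Hypothetical diagnostic helper; an empty listing
// matches the NetworkReady=false condition above.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration present; node stays NotReady")
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(dir, e.Name()))
	}
}

The two failures are independent: a CNI file appearing clears NetworkReady, while the status patches keep failing until the webhook certificate is rotated.
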
2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.759203 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.759244 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.759257 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.759278 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.759293 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.771255 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.774651 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.774763 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.774827 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.774889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.774956 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.784773 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/1.log" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.785762 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/0.log" Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.786755 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.786896 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.790308 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08" exitCode=1 Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.790363 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.790527 4666 scope.go:117] "RemoveContainer" containerID="a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.793055 4666 scope.go:117] "RemoveContainer" containerID="61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.794515 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.794588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.794603 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.794627 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.794646 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: E1203 12:14:04.795715 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.808817 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.822038 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.833809 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.848787 4666 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.863780 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.879656 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.893242 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.897265 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.897302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.897315 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.897332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.897344 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:04Z","lastTransitionTime":"2025-12-03T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.914190 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa54
2453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.926291 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.938422 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.953349 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.970844 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.983155 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:04 crc kubenswrapper[4666]: I1203 12:14:04.998395 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.001903 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.001937 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.001949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.001968 4666 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.001979 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.013205 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.104978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.105029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.105039 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.105056 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.105066 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.203004 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-s4f78"] Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.203732 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.203817 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.207533 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.207582 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.207594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.207611 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.207627 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.219774 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6mxm\" (UniqueName: \"kubernetes.io/projected/1889fa0a-c57e-4b03-884b-f096236b084b-kube-api-access-g6mxm\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.219817 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.227596 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.242416 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.257917 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.282256 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa54
2453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.297005 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.310263 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.310318 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.310330 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.310350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.310363 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.312478 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.320305 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6mxm\" (UniqueName: \"kubernetes.io/projected/1889fa0a-c57e-4b03-884b-f096236b084b-kube-api-access-g6mxm\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.320363 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.320559 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.320631 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:05.82060904 +0000 UTC m=+34.665570101 (durationBeforeRetry 500ms). 
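Nearly every status_manager.go entry in this stretch fails the same way: the kubelet's status PATCH is rejected because the pod.network-node-identity.openshift.io admission webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, long before the node's clock time of 2025-12-03. A minimal Go sketch of how one might confirm the expiry from the node follows; the address comes from the failing Post URL in the log, while running it locally and skipping verification merely to read the presented chain are illustrative assumptions, not OpenShift tooling.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Webhook endpoint taken from the failing Post URL in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // read the certificate without trusting it
	})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	// Inspect the leaf certificate the webhook served.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("notBefore=%s notAfter=%s expired=%v\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		time.Now().After(cert.NotAfter))
}

On CRC this situation typically resolves itself once the cluster's certificate rotation completes after start-up, rather than requiring manual changes to the webhook.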
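Interleaved with the webhook failures, setters.go keeps re-reporting the node Ready condition as False ("Node became not ready") because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ — expected while the ovnkube-node pod is still crash-looping. The condition is logged as a single JSON object; below is a small Go sketch for decoding it, where the struct is a hand-written subset of the Kubernetes NodeCondition type assumed here for illustration.

package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors only the fields visible in the "Node became not
// ready" entries above (a subset of the real Kubernetes NodeCondition).
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from the 12:14:05 log entry.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal:", err)
		return
	}
	fmt.Printf("%s=%s reason=%s\n%s\n", c.Type, c.Status, c.Reason, c.Message)
}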
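Finally, note that the metrics-certs mount for network-metrics-daemon-s4f78 is not retried inline: nestedpendingoperations.go schedules the next attempt ("No retries permitted until … durationBeforeRetry 500ms" above, restated by the Error line below). The sketch that follows shows the general exponential-backoff shape such per-volume retries take; the 500ms initial delay is from the log, while the doubling factor and the cap are assumptions for illustration rather than kubelet internals.

package main

import (
	"errors"
	"fmt"
	"time"
)

// mountOnce stands in for a single MountVolume.SetUp attempt. It keeps
// failing while the Secret backing the volume ("metrics-daemon-secret"
// above) has not yet been registered with the kubelet.
func mountOnce() error {
	return errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)
}

func main() {
	backoff := 500 * time.Millisecond // initial delay, as seen in the log
	const maxBackoff = 2 * time.Minute // cap is an assumed value for this sketch

	for attempt := 1; attempt <= 4; attempt++ {
		err := mountOnce()
		if err == nil {
			fmt.Println("mount succeeded")
			return
		}
		fmt.Printf("attempt %d failed: %v; no retries permitted for %v\n", attempt, err, backoff)
		time.Sleep(backoff) // the kubelet schedules the retry instead of blocking, but the spacing is the same
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}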
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.325415 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"
,\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.336968 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6mxm\" (UniqueName: \"kubernetes.io/projected/1889fa0a-c57e-4b03-884b-f096236b084b-kube-api-access-g6mxm\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.340103 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.350982 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.363028 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.377123 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.389668 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.406022 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.412499 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.412564 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.412574 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.412591 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.412608 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.422851 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.422987 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.424255 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.445874 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.462956 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.515874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.515929 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.515943 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.515965 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.515981 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.619352 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.619399 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.619409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.619429 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.619441 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.721308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.721341 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.721350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.721366 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.721377 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.797285 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" event={"ID":"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce","Type":"ContainerStarted","Data":"7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.799293 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/1.log" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.804034 4666 scope.go:117] "RemoveContainer" containerID="61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08" Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.804236 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.816671 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.824050 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.824143 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.824158 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.824185 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.824201 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.825621 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.826455 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:05 crc kubenswrapper[4666]: E1203 12:14:05.826639 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:06.826600019 +0000 UTC m=+35.671561250 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.840003 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.855333 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.869270 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.881478 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.894521 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.907077 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.923175 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.926782 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.926849 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.926864 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.926888 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.926904 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:05Z","lastTransitionTime":"2025-12-03T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
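
Each of these err="failed to patch status ..." records embeds the entire rejected patch as an escaped JSON string, which is what makes the lines so hard to scan. Peeling the escape layers off one payload and pretty-printing it makes the conditions and containerStatuses readable again. The loop below is illustrative only: the number of escape layers depends on how the journal text was captured (journalctl output, copy/paste, etc.), so it just retries until json.loads yields an object:

    import codecs, json, sys

    payload = sys.stdin.read().strip()

    # Peel escape layers until json.loads yields the patch object.
    for _ in range(4):
        try:
            obj = json.loads(payload)
        except json.JSONDecodeError:
            payload = codecs.decode(payload, "unicode_escape")
            continue
        if isinstance(obj, dict):
            print(json.dumps(obj, indent=2))
            break
        payload = obj  # string-in-string: unwrap one layer and retry

Applied to the kube-controller-manager-crc payload just above, this yields an ordinary pod status: five True conditions and four running containers. The pod itself is healthy; only the delivery of the patch is failing.
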
Has your network provider started?"} Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.939955 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.957381 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.976768 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a63c8ad1b0c43025106ffc29070ddff62251bf631088a17719a4d73810ea9989\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:02Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.070915 5924 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071131 5924 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071231 5924 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 12:14:02.071390 5924 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071552 5924 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 12:14:02.071750 5924 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 12:14:02.071922 5924 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 
12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:05 crc kubenswrapper[4666]: I1203 12:14:05.991442 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.007371 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.022202 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.029161 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.029197 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.029208 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.029228 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.029241 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.036956 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.051510 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.066614 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.079868 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.094959 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.113881 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.127957 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.132134 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.132181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.132190 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.132208 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.132221 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.145911 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.162785 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.183177 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ff
ac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.201717 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.219664 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.233047 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.234776 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.234848 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.234866 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.234891 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.234907 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.258768 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa54
2453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.274944 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.291763 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: 
I1203 12:14:06.306281 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.321488 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.337079 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.337146 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.337160 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.337177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.337189 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.423053 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.423143 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.423193 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:06 crc kubenswrapper[4666]: E1203 12:14:06.423244 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:06 crc kubenswrapper[4666]: E1203 12:14:06.423535 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:06 crc kubenswrapper[4666]: E1203 12:14:06.423684 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.439679 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.439714 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.439724 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.439742 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.439755 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.542703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.542766 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.542784 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.542810 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.542830 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.645344 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.645615 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.645728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.645802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.645875 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.749735 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.749803 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.749822 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.749854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.749875 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.834672 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:06 crc kubenswrapper[4666]: E1203 12:14:06.836002 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:06 crc kubenswrapper[4666]: E1203 12:14:06.836152 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:08.836120159 +0000 UTC m=+37.681081370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.853770 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.853836 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.853865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.853914 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.853941 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.957277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.957339 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.957352 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.957372 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:06 crc kubenswrapper[4666]: I1203 12:14:06.957385 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:06Z","lastTransitionTime":"2025-12-03T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.060580 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.060685 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.060697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.060716 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.060728 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.107467 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.127969 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc 
kubenswrapper[4666]: I1203 12:14:07.141629 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.156164 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.163326 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.163362 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.163380 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.163399 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.163412 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.171343 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.193351 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.207033 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.220527 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.234065 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.240407 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.240545 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.240587 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.240615 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.240660 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.240828 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.240856 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.240873 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.240926 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:23.24090713 +0000 UTC m=+52.085868191 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241268 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241292 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241305 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241334 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241378 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241352 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 12:14:23.241320811 +0000 UTC m=+52.086282012 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241556 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:23.241530827 +0000 UTC m=+52.086491878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241580 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:23.241574338 +0000 UTC m=+52.086535389 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.241600 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:23.241594818 +0000 UTC m=+52.086555869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.245869 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.262366 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.266218 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.266270 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.266281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.266302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.266315 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.274321 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.285359 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.296695 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.309532 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.323576 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.335352 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.368922 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.368968 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.368978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.369000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.369013 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.423642 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:07 crc kubenswrapper[4666]: E1203 12:14:07.423818 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.471948 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.472010 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.472022 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.472044 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.472058 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.575017 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.575119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.575145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.575180 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.575203 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.677907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.677971 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.677990 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.678015 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.678034 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.781660 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.781703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.781714 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.781732 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.781743 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.884891 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.884936 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.884948 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.884964 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.884974 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.987811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.987887 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.987896 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.987911 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:07 crc kubenswrapper[4666]: I1203 12:14:07.987922 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:07Z","lastTransitionTime":"2025-12-03T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.090726 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.090764 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.090772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.090788 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.090799 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.193776 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.193831 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.193840 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.193860 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.193871 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.297508 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.297847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.298138 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.298236 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.298318 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.401609 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.401678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.401703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.401741 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.401766 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.422809 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.422855 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.422946 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:08 crc kubenswrapper[4666]: E1203 12:14:08.423058 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:08 crc kubenswrapper[4666]: E1203 12:14:08.423232 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:08 crc kubenswrapper[4666]: E1203 12:14:08.423363 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.504638 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.504734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.504769 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.504807 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.504835 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.608236 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.608285 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.608293 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.608310 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.608321 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.711628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.711697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.711712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.711734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.711745 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.814999 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.815054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.815068 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.815110 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.815122 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.857029 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:08 crc kubenswrapper[4666]: E1203 12:14:08.857303 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:08 crc kubenswrapper[4666]: E1203 12:14:08.857370 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:12.857349668 +0000 UTC m=+41.702310709 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.918041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.918143 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.918162 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.918193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:08 crc kubenswrapper[4666]: I1203 12:14:08.918218 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:08Z","lastTransitionTime":"2025-12-03T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.022060 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.022141 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.022157 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.022180 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.022192 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.135675 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.135747 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.135772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.135809 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.135839 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.239200 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.239548 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.239644 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.239737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.239826 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.343227 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.343645 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.343906 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.344014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.344134 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.423458 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:09 crc kubenswrapper[4666]: E1203 12:14:09.424245 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.447460 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.447542 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.447557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.447575 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.447588 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.550049 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.550152 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.550175 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.550207 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.550233 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.653530 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.653831 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.653966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.654083 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.654216 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.757767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.757867 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.757884 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.757918 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.757941 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.861354 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.861416 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.861435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.861467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.861487 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.964346 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.964387 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.964394 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.964410 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:09 crc kubenswrapper[4666]: I1203 12:14:09.964419 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:09Z","lastTransitionTime":"2025-12-03T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.067920 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.067973 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.067989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.068011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.068030 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.172020 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.172132 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.172160 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.172193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.172215 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.275013 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.275150 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.275178 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.275213 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.275328 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.378434 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.378911 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.378978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.379023 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.379051 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.423362 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.423362 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.423362 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:14:10 crc kubenswrapper[4666]: E1203 12:14:10.423991 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:14:10 crc kubenswrapper[4666]: E1203 12:14:10.424080 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b"
Dec 03 12:14:10 crc kubenswrapper[4666]: E1203 12:14:10.424192 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.482233 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.482609 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.482738 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.482862 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.482952 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.586755 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.586829 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.586849 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.586881 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.586901 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.689853 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.689904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.689917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.689939 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.689952 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.793392 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.793439 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.793450 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.793473 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.793484 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.896054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.896128 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.896143 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.896166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.896177 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.998887 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.998959 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.998979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.999033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:10 crc kubenswrapper[4666]: I1203 12:14:10.999054 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:10Z","lastTransitionTime":"2025-12-03T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.103460 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.103519 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.103529 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.103550 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.103561 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.206534 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.206582 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.206592 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.206610 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.206621 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.310709 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.310772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.310782 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.310802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.310815 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.413614 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.413663 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.413674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.413692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.413708 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.423169 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:11 crc kubenswrapper[4666]: E1203 12:14:11.423292 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.446144 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.462432 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.477785 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.494136 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.510547 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.516789 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.516832 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.516843 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.516861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.516874 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.524099 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.533979 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.548597 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.565838 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.577316 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.588272 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.601525 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.616570 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.619464 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.619605 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.619622 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.619642 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.619654 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.632723 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220
d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.649067 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.673372 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.722459 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.722516 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:11 crc 
kubenswrapper[4666]: I1203 12:14:11.722529 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.722555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.722571 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.829475 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.829521 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.829536 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.829565 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.829581 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.932824 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.932863 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.932872 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.932888 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:11 crc kubenswrapper[4666]: I1203 12:14:11.932899 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:11Z","lastTransitionTime":"2025-12-03T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.038068 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.038127 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.038136 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.038154 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.038165 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.140906 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.141027 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.141041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.141127 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.141145 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.244724 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.244778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.244794 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.244813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.244828 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.347036 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.347149 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.347166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.347186 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.347200 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.423542 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.423648 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:12 crc kubenswrapper[4666]: E1203 12:14:12.423703 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.423656 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:12 crc kubenswrapper[4666]: E1203 12:14:12.423838 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:12 crc kubenswrapper[4666]: E1203 12:14:12.423965 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.450645 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.450692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.450705 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.450723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.450738 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.554001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.554053 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.554066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.554110 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.554130 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.658404 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.658466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.658479 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.658502 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.658519 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.762058 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.762155 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.762172 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.762200 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.762218 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.865547 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.865640 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.865661 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.865692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.865713 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.905628 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:12 crc kubenswrapper[4666]: E1203 12:14:12.905853 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:12 crc kubenswrapper[4666]: E1203 12:14:12.905926 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:20.905906063 +0000 UTC m=+49.750867114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.968941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.968986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.968998 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.969011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:12 crc kubenswrapper[4666]: I1203 12:14:12.969021 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:12Z","lastTransitionTime":"2025-12-03T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.071273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.071325 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.071341 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.071359 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.071372 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.173745 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.173794 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.173805 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.173822 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.173836 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.276637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.276712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.276736 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.276764 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.276782 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.379203 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.379261 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.379269 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.379287 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.379298 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.423395 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:13 crc kubenswrapper[4666]: E1203 12:14:13.423610 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.482150 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.482202 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.482214 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.482235 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.482245 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.585458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.585513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.585523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.585539 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.585551 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.688281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.688326 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.688335 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.688350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.688359 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.792298 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.792355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.792371 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.792390 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.792404 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.895328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.895389 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.895403 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.895420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.895433 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.998239 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.998293 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.998302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.998320 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:13 crc kubenswrapper[4666]: I1203 12:14:13.998334 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:13Z","lastTransitionTime":"2025-12-03T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.100508 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.100585 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.100609 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.100637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.100656 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.202963 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.203014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.203025 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.203041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.203052 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.305908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.305950 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.305960 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.305976 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.305985 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.408460 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.408537 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.408551 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.408570 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.408582 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.422747 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.422796 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.422757 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.422879 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.422988 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.423081 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.511040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.511079 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.511126 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.511145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.511158 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.614041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.614157 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.614177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.614199 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.614212 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.717980 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.718040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.718053 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.718074 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.718116 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.821166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.821207 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.821218 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.821236 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.821250 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.867920 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.867967 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.867980 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.868001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.868015 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.882560 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.888644 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.888722 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.888737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.889121 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.889149 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.903493 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.908696 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.908749 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.908767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.908789 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.908805 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.923185 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.928448 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.928501 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.928520 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.928547 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.928566 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.944009 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list omitted; byte-identical to the image list in the first failed patch attempt above... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.948643 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.948690 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
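Every patch attempt in this stream dies at the same admission webhook: the serving certificate behind https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-03, so the kubelet's HTTP client rejects the TLS handshake before the /node request is ever delivered. A quick way to confirm what validity window a live endpoint presents is to pull the peer certificate without verifying it and print notBefore/notAfter. A minimal sketch, assuming Python 3 with the third-party cryptography package (version 42+ for the *_utc properties) is available on the node; host and port are taken from the failing Post URL in the error above:

```python
# Sketch: print the validity window of the certificate served by the
# node.network-node-identity.openshift.io webhook endpoint from the log.
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # from Post "https://127.0.0.1:9743/node?timeout=10s"

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False       # we only want to look at the certificate,
ctx.verify_mode = ssl.CERT_NONE  # not to trust it

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER, even unverified

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)
```

On this node the printed notAfter should match the 2025-08-24T17:21:41Z deadline quoted in the x509 error; until that certificate is rotated, every retry below fails the same way.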
event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.948703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.948727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.948740 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.966628 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
Dec 03 12:14:14 crc kubenswrapper[4666]: E1203 12:14:14.966806 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.969332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.969396 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.969415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.969441 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:14 crc kubenswrapper[4666]: I1203 12:14:14.969461 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:14Z","lastTransitionTime":"2025-12-03T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.072192 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.072240 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.072251 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.072268 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.072281 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.174836 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.174874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.174886 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.174902 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.174913 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.277741 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.277799 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.277816 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.277838 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.277856 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.381724 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.381822 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.381840 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.381865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.381883 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.423640 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:15 crc kubenswrapper[4666]: E1203 12:14:15.423871 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.485354 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.485408 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.485417 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.485436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.485447 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.587873 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.587933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.587946 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.587966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.587979 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.690697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.690740 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.690751 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.690774 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.690796 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.793883 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.793951 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.793966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.793987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.793998 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.898246 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.898455 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.898474 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.898500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:15 crc kubenswrapper[4666]: I1203 12:14:15.898520 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:15Z","lastTransitionTime":"2025-12-03T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.001422 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.001461 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.001470 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.001485 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.001495 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.103922 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.103962 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.103973 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.103989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.104003 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.206291 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.206345 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.206355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.206370 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.206381 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.308978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.309018 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.309027 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.309042 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.309052 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.412241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.412282 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.412297 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.412316 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.412329 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.423477 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.423540 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:16 crc kubenswrapper[4666]: E1203 12:14:16.423626 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.423762 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:16 crc kubenswrapper[4666]: E1203 12:14:16.424309 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.424724 4666 scope.go:117] "RemoveContainer" containerID="61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08" Dec 03 12:14:16 crc kubenswrapper[4666]: E1203 12:14:16.424725 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.515614 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.515674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.515692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.515721 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.515780 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.620373 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.620472 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.620496 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.620523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.620559 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.723925 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.723979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.723989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.724006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.724018 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.827444 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.827532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.827544 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.827565 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.827577 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.930884 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.931698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.931841 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.931964 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:16 crc kubenswrapper[4666]: I1203 12:14:16.932053 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:16Z","lastTransitionTime":"2025-12-03T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.035718 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.035781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.035796 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.035819 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.035836 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.145440 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.145485 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.145495 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.145512 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.145523 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.248034 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.248079 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.248140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.248189 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.248205 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.350858 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.350908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.350920 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.350941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.350955 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.422766 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:17 crc kubenswrapper[4666]: E1203 12:14:17.422938 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.453941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.454013 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.454027 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.454048 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.454062 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.556899 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.556949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.556963 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.556986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.557001 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.660721 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.660806 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.660822 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.660842 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.660855 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.764066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.764307 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.764378 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.764441 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.764538 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.853371 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/1.log" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.859368 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.860603 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.868592 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.868657 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.868674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.868696 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.868711 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.886229 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.917729 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"h
ostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.934757 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.955061 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.972560 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.972629 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.972645 4666 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.972670 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.972684 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:17Z","lastTransitionTime":"2025-12-03T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:17 crc kubenswrapper[4666]: I1203 12:14:17.980232 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.002841 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.019205 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.033438 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc 
kubenswrapper[4666]: I1203 12:14:18.051306 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.068844 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.075998 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.076066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.076119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.076145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.076167 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.081062 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.094408 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.109450 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.125124 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.139164 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.160507 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.179304 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.179596 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc 
kubenswrapper[4666]: I1203 12:14:18.179713 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.179783 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.179859 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.290224 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.290283 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.290295 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.290315 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.290329 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.392577 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.392670 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.392694 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.392726 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.392744 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.423441 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.423526 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.423442 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:18 crc kubenswrapper[4666]: E1203 12:14:18.423653 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:18 crc kubenswrapper[4666]: E1203 12:14:18.423917 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:18 crc kubenswrapper[4666]: E1203 12:14:18.424125 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.495873 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.495915 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.495928 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.495949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.495961 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.599282 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.599340 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.599357 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.599384 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.599405 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.702494 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.702563 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.702578 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.702600 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.702614 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.804874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.804923 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.804935 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.804955 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.804967 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.908721 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.908810 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.908826 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.908854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:18 crc kubenswrapper[4666]: I1203 12:14:18.908873 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:18Z","lastTransitionTime":"2025-12-03T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.012036 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.012102 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.012112 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.012134 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.012145 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.115582 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.115634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.115646 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.115669 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.115682 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.218624 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.218692 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.218703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.218722 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.218732 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.321872 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.321908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.321917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.321934 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.321944 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.423262 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:19 crc kubenswrapper[4666]: E1203 12:14:19.423504 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.425704 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.425745 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.425758 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.425775 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.425788 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.528750 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.528798 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.528815 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.528835 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.528847 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.632251 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.632336 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.632355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.632377 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.632393 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.735670 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.735716 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.735729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.735746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.735759 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.838611 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.838650 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.838660 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.838678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.838696 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.868932 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/2.log" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.869894 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/1.log" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.873555 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268" exitCode=1 Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.873615 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268"} Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.873667 4666 scope.go:117] "RemoveContainer" containerID="61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.874908 4666 scope.go:117] "RemoveContainer" containerID="798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268" Dec 03 12:14:19 crc kubenswrapper[4666]: E1203 12:14:19.875294 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.892775 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.907972 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.935030 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:19 crc 
kubenswrapper[4666]: I1203 12:14:19.941002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.941039 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.941048 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.941063 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.941105 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:19Z","lastTransitionTime":"2025-12-03T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.947458 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z"
Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.961171 4666 status_manager.go:875]
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.974683 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:19 crc kubenswrapper[4666]: I1203 12:14:19.987668 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.001308 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:19Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.019366 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01
b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.033333 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.043252 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.043294 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.043308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.043328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.043341 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.047737 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.059179 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.071330 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.083455 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.099513 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.118474 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:20Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.145899 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.146194 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.146299 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.146406 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.146510 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.249339 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.249392 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.249406 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.249426 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.249438 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.352759 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.352841 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.352864 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.352893 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.352912 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.423177 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:20 crc kubenswrapper[4666]: E1203 12:14:20.423817 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.423376 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.423245 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:20 crc kubenswrapper[4666]: E1203 12:14:20.424176 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:20 crc kubenswrapper[4666]: E1203 12:14:20.424422 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.456833 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.456889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.456900 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.456919 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.456932 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.560829 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.561425 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.561667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.561861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.562041 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.664907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.664957 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.664967 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.664986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.664998 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.768621 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.768702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.768720 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.768751 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.768773 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.872303 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.872466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.872557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.872650 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.872744 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.882984 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/2.log" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.976202 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.976301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.976319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.976342 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:20 crc kubenswrapper[4666]: I1203 12:14:20.976362 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:20Z","lastTransitionTime":"2025-12-03T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.002856 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:21 crc kubenswrapper[4666]: E1203 12:14:21.003080 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:21 crc kubenswrapper[4666]: E1203 12:14:21.003246 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:37.003216301 +0000 UTC m=+65.848177362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.080590 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.080667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.080681 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.080708 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.080723 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.184420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.184521 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.184540 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.184600 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.184620 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.289181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.289252 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.289274 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.289394 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.289415 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.392893 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.392961 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.392977 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.393007 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.393024 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.422919 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:21 crc kubenswrapper[4666]: E1203 12:14:21.423225 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.441012 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.465687 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.481596 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.496261 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.497092 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.497176 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.497192 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.497212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.497225 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.512021 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.527339 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.548449 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f
422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.566687 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.581723 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.599128 
4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646
aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.600155 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.600262 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.600329 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.600398 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.600469 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.614985 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.627764 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.642735 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.658431 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.673061 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.688883 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.703363 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.703420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.703429 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.703445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.703458 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.807010 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.807061 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.807074 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.807125 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.807142 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.909676 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.909724 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.909737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.909755 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:21 crc kubenswrapper[4666]: I1203 12:14:21.909769 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:21Z","lastTransitionTime":"2025-12-03T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.011649 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.011693 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.011702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.011722 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.011731 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.114175 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.114247 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.114267 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.114287 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.114299 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.217499 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.217546 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.217555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.217575 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.217587 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.320567 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.320612 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.320620 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.320634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.320644 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.422493 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.422521 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.422571 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:22 crc kubenswrapper[4666]: E1203 12:14:22.422649 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:22 crc kubenswrapper[4666]: E1203 12:14:22.422803 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:22 crc kubenswrapper[4666]: E1203 12:14:22.422906 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.423440 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.423490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.423500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.423522 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.423533 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.526898 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.526967 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.526981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.527005 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.527019 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.630608 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.630650 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.630667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.630683 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.630694 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.733813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.733873 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.733884 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.733904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.733915 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.837045 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.837217 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.837245 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.837277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.837300 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.940765 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.940808 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.940817 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.940837 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:22 crc kubenswrapper[4666]: I1203 12:14:22.940848 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:22Z","lastTransitionTime":"2025-12-03T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.043836 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.043902 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.043915 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.043935 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.043949 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.147938 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.148002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.148020 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.148046 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.148065 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.253419 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.253704 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.253768 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.253879 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.253943 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.333235 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.333549 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.333642 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.333709 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.333798 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.333868 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.333923 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.333942 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.333992 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334087 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334225 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334031 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:55.333999197 +0000 UTC m=+84.178960448 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334365 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:55.334329206 +0000 UTC m=+84.179290457 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334393 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:14:55.334377897 +0000 UTC m=+84.179339198 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334146 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334465 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:55.334456669 +0000 UTC m=+84.179417960 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334261 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.334530 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:14:55.334523411 +0000 UTC m=+84.179484662 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.356581 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.356902 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.357002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.357119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.357211 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.423671 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:23 crc kubenswrapper[4666]: E1203 12:14:23.423894 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.460307 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.460358 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.460370 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.460388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.460399 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.563403 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.563966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.563989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.564016 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.564035 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.666191 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.666243 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.666255 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.666273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.666286 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.768937 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.768978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.768988 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.769004 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.769014 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.871555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.871634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.871652 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.871679 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.871699 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.975973 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.976066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.976156 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.976193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:23 crc kubenswrapper[4666]: I1203 12:14:23.976217 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:23Z","lastTransitionTime":"2025-12-03T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.078986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.079049 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.079060 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.079078 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.079196 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.181928 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.181979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.181989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.182009 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.182020 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.285604 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.285666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.285681 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.285701 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.285717 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.388528 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.388584 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.388606 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.388631 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.388646 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.422743 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.422769 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.422743 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:24 crc kubenswrapper[4666]: E1203 12:14:24.422924 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:24 crc kubenswrapper[4666]: E1203 12:14:24.422987 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:24 crc kubenswrapper[4666]: E1203 12:14:24.423118 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.492032 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.492108 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.492118 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.492133 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.492143 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.595331 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.595407 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.595434 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.595475 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.595501 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.699775 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.700123 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.700222 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.700335 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.700436 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.730653 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.750723 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.750864 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.767863 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.789502 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.803987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.804052 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.804070 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.804126 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.804149 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.817336 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a425
8ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.846212 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.868399 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.882287 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.899063 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.906207 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.906250 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.906259 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.906277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.906288 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:24Z","lastTransitionTime":"2025-12-03T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.920322 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.934280 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.948875 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.962801 
4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.979642 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:24 crc kubenswrapper[4666]: I1203 12:14:24.991828 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.003784 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.008279 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.008327 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.008338 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.008357 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.008368 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.021032 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 
12:14:25.110952 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.111004 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.111013 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.111031 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.111043 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.213811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.213879 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.213896 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.213920 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.213940 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.316731 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.316842 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.316865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.316894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.316912 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.368604 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.368675 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.368694 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.368727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.368752 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.383460 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.389608 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.389667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.389678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.389704 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.389715 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.407639 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.412180 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.412224 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.412238 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.412257 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.412269 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.422856 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.423055 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.426001 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.430635 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.430674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.430687 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.430706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.430718 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.443575 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.447905 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.447970 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.447981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.448001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.448013 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.463304 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:25 crc kubenswrapper[4666]: E1203 12:14:25.463432 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.465378 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
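The "update node status exceeds retry count" line marks the kubelet giving up after a fixed number of consecutive patch attempts within one sync; upstream defines this as nodeStatusUpdateRetry = 5, though the exact value should be checked against the kubelet build in use. A sketch of the pattern, with tryPatchStatus as a hypothetical stand-in for the real node-status PATCH call:

package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry mirrors the upstream kubelet constant (5 at the
// time of writing; verify against your kubelet source).
const nodeStatusUpdateRetry = 5

// updateNodeStatus retries tryPatchStatus and gives up with the same
// wording that appears in the log above.
func updateNodeStatus(tryPatchStatus func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchStatus(); err != nil {
			// corresponds to "Error updating node status, will retry"
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	// Simulate the webhook rejecting every attempt, as in this journal.
	err := updateNodeStatus(func() error {
		return errors.New("failed calling webhook: certificate has expired")
	})
	fmt.Println(err)
}

Because the admission webhook fails identically on every attempt, all five retries burn through immediately and the kubelet re-enters the same loop on the next sync, which is why the error block repeats verbatim above.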
event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.465436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.465452 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.465481 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.465499 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.568284 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.568332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.568343 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.568360 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.568373 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.672358 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.672420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.672430 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.672449 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.672461 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.775507 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.775808 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.775890 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.775959 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.776034 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.880011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.880088 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.880124 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.880152 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.880175 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.984029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.984147 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.984174 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.984204 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:25 crc kubenswrapper[4666]: I1203 12:14:25.984222 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:25Z","lastTransitionTime":"2025-12-03T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.087727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.087785 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.087806 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.087830 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.087845 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.191719 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.191793 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.191811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.191839 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.191858 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.294527 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.294595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.294605 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.294644 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.294659 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.397947 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.398040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.398054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.398075 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.398114 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.422576 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.422649 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.422625 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:26 crc kubenswrapper[4666]: E1203 12:14:26.422764 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:26 crc kubenswrapper[4666]: E1203 12:14:26.422888 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:26 crc kubenswrapper[4666]: E1203 12:14:26.422973 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
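Separately from the webhook failure, every NotReady heartbeat and every "Error syncing pod, skipping" entry carries the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. Listing that directory is the first triage step; the sketch below does only that (read access to the path is assumed, e.g. run on the node as root):

package main

import (
	"fmt"
	"os"
)

func main() {
	// Directory taken verbatim from the kubelet message above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files found; network plugin not ready")
		return
	}
	for _, e := range entries {
		fmt.Println("found CNI config:", e.Name())
	}
}

An empty (or unreadable) directory here means the network provider, OVN-Kubernetes on this cluster, has not written its config yet, which is consistent with its node-identity webhook being broken by the expired certificate above.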
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.501313 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.501368 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.501386 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.501406 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.501419 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.604071 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.604153 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.604162 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.604181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.604193 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.707223 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.707273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.707283 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.707302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.707314 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.810628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.810926 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.810938 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.810960 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.810975 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.912702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.912746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.912757 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.912773 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:26 crc kubenswrapper[4666]: I1203 12:14:26.912783 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:26Z","lastTransitionTime":"2025-12-03T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.016745 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.016810 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.016827 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.016846 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.016859 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.119856 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.119918 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.119932 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.119957 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.119976 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.222337 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.222390 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.222402 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.222422 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.222437 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.324970 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.325043 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.325060 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.325119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.325138 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.423174 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:27 crc kubenswrapper[4666]: E1203 12:14:27.423343 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.427137 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.427171 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.427180 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.427196 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.427207 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.530870 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.530931 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.530956 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.530976 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:27 crc kubenswrapper[4666]: I1203 12:14:27.530989 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:27Z","lastTransitionTime":"2025-12-03T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.046184 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.046255 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.046267 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.046308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.046320 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:28Z","lastTransitionTime":"2025-12-03T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.422856 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.422923 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:14:28 crc kubenswrapper[4666]: I1203 12:14:28.422856 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:14:28 crc kubenswrapper[4666]: E1203 12:14:28.423146 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:14:28 crc kubenswrapper[4666]: E1203 12:14:28.423215 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:14:28 crc kubenswrapper[4666]: E1203 12:14:28.423332 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b"
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.188893 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.188952 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.188964 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.188981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.188993 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:29Z","lastTransitionTime":"2025-12-03T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:29 crc kubenswrapper[4666]: I1203 12:14:29.423508 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:29 crc kubenswrapper[4666]: E1203 12:14:29.423790 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.014786 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.014830 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.014844 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.014861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.014874 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.423456 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.423574 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:14:30 crc kubenswrapper[4666]: E1203 12:14:30.423726 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.423750 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:14:30 crc kubenswrapper[4666]: E1203 12:14:30.423860 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:14:30 crc kubenswrapper[4666]: E1203 12:14:30.424182 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b"
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.426322 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.426356 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.426374 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.426396 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.426412 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.529843 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.529907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.529921 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.529944 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.529960 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.633779 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.633866 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.633885 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.633919 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.633942 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.737572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.737703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.737723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.737758 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.737776 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.840163 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.840244 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.840263 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.840291 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.840310 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.944008 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.944046 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.944067 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.944085 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:30 crc kubenswrapper[4666]: I1203 12:14:30.944116 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:30Z","lastTransitionTime":"2025-12-03T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.047597 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.047646 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.047657 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.047674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.047682 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.150040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.150105 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.150121 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.150140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.150154 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.422841 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:31 crc kubenswrapper[4666]: E1203 12:14:31.423010 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.437825 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.450878 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.459824 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.459873 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.459883 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.459900 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.459913 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.466647 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.481966 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.498793 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.516197 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.535947 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.549272 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.561850 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.561894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.561906 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.561923 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.561932 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.565257 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.579435 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.603830 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f
422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61b04b43db7458d1a254a52b9ce0a43de269aa542453ce255456a6ebd01e8d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"message\\\":\\\"ics/network-check-target-xd92c in node crc\\\\nI1203 12:14:04.406191 6099 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406117 6099 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-wbdks in node crc\\\\nI1203 12:14:04.406199 6099 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1203 12:14:04.406199 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-wbdks after 0 failed attempt(s)\\\\nI1203 12:14:04.406203 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1203 12:14:04.406205 6099 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-wbdks\\\\nI1203 12:14:04.406205 6099 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1203 12:14:04.406208 6099 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 12:14:04.406218 6099 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 12:14:04.406222 6099 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.619224 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.633815 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.651161 
4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646
aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.663702 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.665260 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.665409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.665426 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.665453 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.665466 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.675817 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.690917 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.768633 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.768671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.768685 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.768702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.768715 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.871138 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.871186 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.871194 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.871212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.871224 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.975550 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.975624 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.975637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.975655 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:31 crc kubenswrapper[4666]: I1203 12:14:31.975671 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:31Z","lastTransitionTime":"2025-12-03T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.078809 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.078844 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.078853 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.078869 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.078879 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.181585 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.181850 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.181913 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.181987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.182074 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.285865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.286036 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.286059 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.286128 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.286151 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.389058 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.389159 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.389174 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.389194 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.389207 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.422957 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.423010 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.423159 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:32 crc kubenswrapper[4666]: E1203 12:14:32.423169 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:32 crc kubenswrapper[4666]: E1203 12:14:32.423300 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:32 crc kubenswrapper[4666]: E1203 12:14:32.423564 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.492563 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.492646 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.492671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.492706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.492732 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.595421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.595490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.595506 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.595529 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.595542 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.700277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.700318 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.700329 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.700349 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.700362 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.803493 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.803532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.803541 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.803558 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.803569 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.905875 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.905929 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.905943 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.905962 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:32 crc kubenswrapper[4666]: I1203 12:14:32.905976 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:32Z","lastTransitionTime":"2025-12-03T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.008595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.008678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.008689 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.008704 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.008715 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.111916 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.111994 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.112011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.112039 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.112058 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.215282 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.215333 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.215342 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.215359 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.215368 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.318304 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.318378 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.318392 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.318411 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.318423 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.420898 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.420973 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.420994 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.421012 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.421021 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.423518 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:33 crc kubenswrapper[4666]: E1203 12:14:33.423894 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.424212 4666 scope.go:117] "RemoveContainer" containerID="798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268" Dec 03 12:14:33 crc kubenswrapper[4666]: E1203 12:14:33.424499 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.439655 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.456183 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.477858 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01
b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.492721 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.509618 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: 
I1203 12:14:33.523893 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.523931 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.523945 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.523983 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.523996 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.526014 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.543078 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.557200 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.572956 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.588830 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.601016 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.612777 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.625296 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.626861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.626907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.626917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.626933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.626960 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.639787 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a425
8ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.652281 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.663854 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.675008 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:33Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.729912 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.730359 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.730514 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.730654 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.730826 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.833723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.833788 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.833806 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.833829 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.833846 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.936397 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.936457 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.936469 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.936490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:33 crc kubenswrapper[4666]: I1203 12:14:33.936503 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:33Z","lastTransitionTime":"2025-12-03T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.039655 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.039710 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.039722 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.039742 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.039756 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.142865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.142967 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.142988 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.143049 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.143075 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.247635 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.248125 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.248295 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.248443 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.248590 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.351709 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.351767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.351779 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.351801 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.351814 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.422794 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.422831 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.422860 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:34 crc kubenswrapper[4666]: E1203 12:14:34.423855 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:34 crc kubenswrapper[4666]: E1203 12:14:34.423978 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:34 crc kubenswrapper[4666]: E1203 12:14:34.424063 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.455067 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.455135 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.455145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.455163 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.455173 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.558145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.558190 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.558199 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.558220 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.558230 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.661432 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.661479 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.661490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.661508 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.661519 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.764643 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.764703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.764713 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.764733 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.764743 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.867238 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.867285 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.867297 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.867319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.867331 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.969696 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.969761 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.969773 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.969791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:34 crc kubenswrapper[4666]: I1203 12:14:34.969805 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:34Z","lastTransitionTime":"2025-12-03T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.072457 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.072506 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.072517 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.072534 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.072546 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.175676 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.175737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.175746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.175764 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.175773 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.278675 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.278727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.278740 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.278791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.278806 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.381331 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.381392 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.381407 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.381430 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.381445 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.423468 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.423666 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.484405 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.484457 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.484466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.484482 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.484515 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.588727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.588783 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.588793 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.588813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.588826 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.688235 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.688284 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.688295 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.688318 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.688329 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.701313 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.705458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.705503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.705513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.705532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.705543 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.717848 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.721908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.721958 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.721968 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.721984 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.721998 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.738138 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.742609 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.742674 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.742685 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.742703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.742718 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.758546 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.764007 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.764080 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.764123 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.764145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.764160 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.782282 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:35 crc kubenswrapper[4666]: E1203 12:14:35.782456 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.784640 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
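The retry loop above has a single root cause: the serving certificate presented by the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-03T12:14:35Z, so every status patch is rejected by the webhook before it reaches the Node object. A minimal Go sketch for confirming this from the node follows; the endpoint is taken from the log, and the probe deliberately skips chain verification so it can still read the validity window of an already-expired certificate (a diagnostic sketch, not CRC or OpenShift tooling):

// probecert.go: print the validity window of the webhook's serving certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// InsecureSkipVerify is intentional: verification is exactly what fails,
	// and we only want to inspect the certificate the server presents.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("server presented no certificate")
		return
	}
	cert := certs[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	// The log's "current time 2025-12-03T12:14:35Z is after 2025-08-24T17:21:41Z"
	// corresponds to time.Now() falling past notAfter here.
	fmt.Println("expired:  ", time.Now().UTC().After(cert.NotAfter))
}

Until that certificate is rotated, the kubelet keeps cycling through the NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, and NodeNotReady events seen below: each patch attempt fails at the webhook, not in the kubelet itself.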
event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.784690 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.784701 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.784719 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.784730 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.887811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.887859 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.887868 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.887884 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.887894 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.990673 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.990756 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.990777 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.990804 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:35 crc kubenswrapper[4666]: I1203 12:14:35.990822 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:35Z","lastTransitionTime":"2025-12-03T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.093888 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.093928 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.093936 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.093955 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.093966 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.196978 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.197030 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.197040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.197057 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.197069 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.300801 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.300855 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.300874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.300898 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.300916 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.403596 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.403660 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.403671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.403689 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.403701 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.423213 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.423261 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.423213 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:36 crc kubenswrapper[4666]: E1203 12:14:36.423371 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:36 crc kubenswrapper[4666]: E1203 12:14:36.423454 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:36 crc kubenswrapper[4666]: E1203 12:14:36.423516 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.507002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.507052 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.507111 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.507134 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.507146 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.609910 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.609966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.609977 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.609994 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.610004 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.712627 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.712697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.712709 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.712728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.712740 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.815971 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.816017 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.816029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.816051 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.816064 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.918956 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.919010 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.919021 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.919041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:36 crc kubenswrapper[4666]: I1203 12:14:36.919053 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:36Z","lastTransitionTime":"2025-12-03T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.021257 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.021328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.021338 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.021359 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.021370 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:37Z","lastTransitionTime":"2025-12-03T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.086945 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:37 crc kubenswrapper[4666]: E1203 12:14:37.087122 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:14:37 crc kubenswrapper[4666]: E1203 12:14:37.087201 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:09.087176267 +0000 UTC m=+97.932137338 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered
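The mount failure above is deferred rather than fatal: nestedpendingoperations pushes the next attempt out to 12:15:09, a 32s durationBeforeRetry, and that delay grows with each consecutive failure of the same volume operation. An illustrative Go sketch of such a doubling schedule follows; the initial delay and the cap are assumptions chosen for illustration, not the kubelet's exact constants:

// backoff.go: illustrate an exponential backoff schedule like the one logged.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // assumed initial delay
	maxDelay := 2 * time.Minute     // assumed cap
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: wait %s before retrying\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// Around the seventh consecutive failure the delay reaches 32s, matching
	// "No retries permitted until ... (durationBeforeRetry 32s)" above.
}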
Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.124108 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.124151 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.124162 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.124180 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.124192 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:37Z","lastTransitionTime":"2025-12-03T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.226926 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.226989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.227006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.227033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.227122 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:37Z","lastTransitionTime":"2025-12-03T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
[... same five-entry node-status sequence at 12:14:37.329 ...]
Dec 03 12:14:37 crc kubenswrapper[4666]: I1203 12:14:37.423429 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:37 crc kubenswrapper[4666]: E1203 12:14:37.423617 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... same five-entry node-status sequence roughly every 100 ms from 12:14:37.431 through 12:14:38.362 ...]
Dec 03 12:14:38 crc kubenswrapper[4666]: I1203 12:14:38.422815 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:14:38 crc kubenswrapper[4666]: I1203 12:14:38.422893 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:14:38 crc kubenswrapper[4666]: I1203 12:14:38.422836 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:14:38 crc kubenswrapper[4666]: E1203 12:14:38.423020 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:14:38 crc kubenswrapper[4666]: E1203 12:14:38.423372 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b"
Dec 03 12:14:38 crc kubenswrapper[4666]: E1203 12:14:38.423220 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... same five-entry node-status sequence from 12:14:38.465 through 12:14:39.396 ...]
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.423411 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:14:39 crc kubenswrapper[4666]: E1203 12:14:39.423618 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... same five-entry node-status sequence from 12:14:39.500 through 12:14:39.913 ...]
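Every heartbeat and pod-sync failure in this stretch traces back to a single condition: the kubelet finds no CNI network configuration in /etc/kubernetes/cni/net.d/, so it keeps the node NotReady and refuses to create pod sandboxes. Below is a small standalone Go sketch of that readiness condition, the kind of admin-side check one might run while waiting for the network provider; it is not kubelet code, and the accepted extensions (.conf, .conflist, .json) reflect the usual CNI config conventions rather than anything stated in this log.

```go
// Illustrative standalone check (not kubelet source) for the condition the
// log complains about: the node stays NotReady until a CNI network config
// appears in /etc/kubernetes/cni/net.d/.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

const cniConfDir = "/etc/kubernetes/cni/net.d"

// hasCNIConfig reports whether the directory holds a plausible CNI config file.
func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // missing or unreadable directory counts as "not ready"
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	for !hasCNIConfig(cniConfDir) {
		fmt.Println("network not ready: no CNI configuration file in", cniConfDir)
		time.Sleep(2 * time.Second)
	}
	fmt.Println("CNI configuration present; network provider appears started")
}
```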
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.948210 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/0.log"
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.948286 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba134276-4c96-4ba6-b18f-276b312a7355" containerID="7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584" exitCode=1
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.948330 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerDied","Data":"7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584"}
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.948853 4666 scope.go:117] "RemoveContainer" containerID="7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584"
Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.969661 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:39Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:39 crc kubenswrapper[4666]: I1203 12:14:39.987632 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:39Z is after 2025-08-24T17:21:41Z"
Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.005556 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z"
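These status-patch failures all share one cause, independent of the CNI problem above: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose NotAfter is 2025-08-24T17:21:41Z, while the node clock reads 2025-12-03, so every TLS handshake fails x509 verification. The Go sketch below reproduces that validity-window test against a local PEM file; the file path is a placeholder assumption, not a path taken from this log.

```go
// Minimal sketch of the x509 validity check that fails above: compare a
// serving certificate's NotBefore/NotAfter window against the current clock.
// The PEM path is a placeholder assumption, not a path from the log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/path/to/serving-cert.pem") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now()
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		// Same failure mode as the log: "certificate has expired or is not
		// yet valid: current time ... is after ..."
		fmt.Printf("certificate invalid: current time %s is outside [%s, %s]\n",
			now.Format(time.RFC3339),
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339))
		return
	}
	fmt.Println("certificate valid until", cert.NotAfter.Format(time.RFC3339))
}
```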
event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.017558 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.017594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.017614 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.017639 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.025542 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.043243 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.059376 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.076240 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.120885 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.120947 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.120962 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.120988 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.121002 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.122618 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a425
8ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.153536 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.171417 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.188522 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.207058 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.225131 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.225182 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.225195 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.225217 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.225231 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.227350 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.250866 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.267866 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.285690 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: 
I1203 12:14:40.306471 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.328475 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.328523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.328537 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.328557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.328566 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.423282 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.423405 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:40 crc kubenswrapper[4666]: E1203 12:14:40.423448 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:40 crc kubenswrapper[4666]: E1203 12:14:40.423625 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.423282 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:40 crc kubenswrapper[4666]: E1203 12:14:40.423730 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.432915 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.433330 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.433501 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.433641 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.433760 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.537234 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.537314 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.537332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.537354 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.537367 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.640511 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.640968 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.641151 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.641370 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.641529 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.744254 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.744301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.744312 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.744326 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.744337 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.846939 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.847007 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.847019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.847039 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.847050 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.950404 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.950725 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.950858 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.950950 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.951034 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:40Z","lastTransitionTime":"2025-12-03T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.953288 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/0.log" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.953356 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerStarted","Data":"8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd"} Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.971821 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:40 crc kubenswrapper[4666]: I1203 12:14:40.988771 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:40Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.003021 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.017915 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.033218 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.048033 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.053384 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.053444 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.053464 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.053489 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.053505 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.061510 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.074269 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-re
covery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.086723 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.101452 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.114053 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.126234 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.143387 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.155737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.155774 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.155784 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.155802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.155814 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.159701 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.177007 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.200005 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.228690 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.257659 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.257698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.257708 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.257726 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.257735 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.360960 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.361008 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.361022 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.361043 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.361057 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.422624 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:41 crc kubenswrapper[4666]: E1203 12:14:41.422788 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.439634 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.452835 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.464005 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.464076 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.464117 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.464147 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.464166 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.472437 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.492381 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.509179 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.524261 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.540287 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.558431 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.566588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.566946 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.567118 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.567277 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.567401 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.575915 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.596907 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.613954 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.629828 4666 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.642785 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.657551 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.670451 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.670499 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.670513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.670531 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.670542 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.671102 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.683976 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.697732 4666 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:41Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.773319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.773357 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.773366 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.773382 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.773392 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.876491 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.876531 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.876539 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.876557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.876566 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.978423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.978450 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.978458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.978476 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:41 crc kubenswrapper[4666]: I1203 12:14:41.978486 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:41Z","lastTransitionTime":"2025-12-03T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.081373 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.081411 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.081420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.081436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.081444 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.184314 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.184386 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.184407 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.184440 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.184467 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.286813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.287119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.287255 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.287341 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.287416 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.390432 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.390476 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.390485 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.390503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.390514 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.422723 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.422827 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:42 crc kubenswrapper[4666]: E1203 12:14:42.422866 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.422850 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:42 crc kubenswrapper[4666]: E1203 12:14:42.423020 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:42 crc kubenswrapper[4666]: E1203 12:14:42.423049 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.492913 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.492954 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.492963 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.492979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.492989 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.596044 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.596127 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.596138 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.596159 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.596172 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.699314 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.699353 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.699362 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.699379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.699394 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.801845 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.801892 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.801903 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.801921 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.801933 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.904671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.904729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.904739 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.904755 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:42 crc kubenswrapper[4666]: I1203 12:14:42.904763 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:42Z","lastTransitionTime":"2025-12-03T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.007389 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.007435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.007445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.007464 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.007474 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.110144 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.110193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.110207 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.110226 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.110237 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.212734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.212882 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.212894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.212919 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.212932 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.316329 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.316425 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.316464 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.316500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.316519 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.419006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.419056 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.419068 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.419106 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.419121 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.422651 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:43 crc kubenswrapper[4666]: E1203 12:14:43.422885 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.522728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.522775 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.522788 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.522808 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.522821 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.626484 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.626524 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.626538 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.626559 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.626573 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.729573 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.729622 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.729638 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.729667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.729682 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.832817 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.832900 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.832915 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.832936 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.832950 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.936123 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.936178 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.936191 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.936212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:43 crc kubenswrapper[4666]: I1203 12:14:43.936226 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:43Z","lastTransitionTime":"2025-12-03T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.039391 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.039447 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.039460 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.039477 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.039487 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.141941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.142013 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.142038 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.142071 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.142108 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.245341 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.245393 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.245405 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.245425 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.245442 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.348624 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.348688 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.348705 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.348729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.348746 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.423041 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.423080 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.423041 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:44 crc kubenswrapper[4666]: E1203 12:14:44.423216 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:44 crc kubenswrapper[4666]: E1203 12:14:44.423292 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:44 crc kubenswrapper[4666]: E1203 12:14:44.423374 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.451202 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.451734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.451917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.452054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.452227 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.555700 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.556170 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.556299 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.556409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.556518 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.659268 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.659729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.659934 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.660177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.660379 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.764192 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.764250 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.764264 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.764286 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.764301 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.867511 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.867903 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.868167 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.868375 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.868555 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.972354 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.972402 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.972426 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.972455 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:44 crc kubenswrapper[4666]: I1203 12:14:44.972472 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:44Z","lastTransitionTime":"2025-12-03T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.075002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.075045 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.075084 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.075117 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.075130 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.178054 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.178629 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.178720 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.178838 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.178910 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.281855 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.281904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.281912 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.281933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.281945 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.384459 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.384526 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.384551 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.384586 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.384610 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.422735 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:45 crc kubenswrapper[4666]: E1203 12:14:45.422980 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.487069 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.487162 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.487181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.487203 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.487222 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.590435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.590848 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.590929 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.590974 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.591002 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.695024 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.695079 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.695109 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.695131 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.695146 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.797633 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.797680 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.797689 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.797706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.797714 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.900335 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.900413 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.900431 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.900453 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.900467 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.914815 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.914849 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.914860 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.914880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.914898 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: E1203 12:14:45.927986 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.934372 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.934421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.934434 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.934451 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.934462 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: E1203 12:14:45.949060 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.954029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.954118 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.954145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.954177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.954196 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: E1203 12:14:45.975926 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.982224 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.982281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.982294 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.982315 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:45 crc kubenswrapper[4666]: I1203 12:14:45.982328 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:45Z","lastTransitionTime":"2025-12-03T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:45 crc kubenswrapper[4666]: E1203 12:14:45.996521 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:45Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.001323 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.001366 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.001379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.001395 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.001407 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: E1203 12:14:46.014144 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:46 crc kubenswrapper[4666]: E1203 12:14:46.014269 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.016308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.016374 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.016385 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.016424 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.016438 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.120563 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.120629 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.120644 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.120667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.120676 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.223506 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.223578 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.223594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.223620 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.223637 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.327019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.327067 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.327078 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.327146 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.327161 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.423507 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.423575 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.423608 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:46 crc kubenswrapper[4666]: E1203 12:14:46.423739 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:46 crc kubenswrapper[4666]: E1203 12:14:46.423925 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:46 crc kubenswrapper[4666]: E1203 12:14:46.424177 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.429751 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.429814 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.429833 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.429857 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.429883 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.532329 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.532693 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.532774 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.532853 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.532925 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.635711 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.636035 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.636141 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.636222 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.636313 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.739583 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.739625 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.739635 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.739656 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.739669 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.842493 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.842545 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.842555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.842572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.842582 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.944957 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.945009 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.945021 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.945043 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:46 crc kubenswrapper[4666]: I1203 12:14:46.945058 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:46Z","lastTransitionTime":"2025-12-03T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.047607 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.047657 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.047676 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.047698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.047709 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.150538 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.150595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.150608 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.150630 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.150645 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.253590 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.253637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.253649 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.253668 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.253682 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.357252 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.357312 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.357324 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.357342 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.357355 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.423417 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:47 crc kubenswrapper[4666]: E1203 12:14:47.423609 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.460433 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.460491 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.460509 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.460533 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.460550 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.563271 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.563319 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.563356 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.563377 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.563389 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.666688 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.666739 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.666748 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.666764 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.666774 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.769488 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.769538 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.769547 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.769566 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.769580 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.872140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.872416 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.872478 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.872572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.872640 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.975932 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.975968 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.975979 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.975995 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:47 crc kubenswrapper[4666]: I1203 12:14:47.976004 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:47Z","lastTransitionTime":"2025-12-03T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.077981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.078038 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.078050 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.078071 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.078118 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.180954 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.181006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.181019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.181042 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.181056 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.288572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.288632 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.288663 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.288698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.288719 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.391214 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.391272 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.391284 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.391308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.391324 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.422969 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.423025 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.422998 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:48 crc kubenswrapper[4666]: E1203 12:14:48.423185 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:48 crc kubenswrapper[4666]: E1203 12:14:48.423242 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:48 crc kubenswrapper[4666]: E1203 12:14:48.423333 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.424235 4666 scope.go:117] "RemoveContainer" containerID="798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.438470 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.494961 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.495229 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.495305 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.495378 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.495480 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.599916 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.600002 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.600032 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.600058 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.600071 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.704019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.704100 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.704111 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.704129 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.704140 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.807135 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.807183 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.807192 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.807209 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.807218 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.910447 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.911172 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.911266 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.911358 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:48 crc kubenswrapper[4666]: I1203 12:14:48.911402 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:48Z","lastTransitionTime":"2025-12-03T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.014731 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.014777 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.014790 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.014811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.014828 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.118529 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.118611 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.118634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.118664 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.118684 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.221695 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.221798 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.221821 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.221854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.221876 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.325185 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.325238 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.325255 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.325283 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.325300 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.423626 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:49 crc kubenswrapper[4666]: E1203 12:14:49.423932 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.428789 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.428872 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.428898 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.428933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.428956 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.533186 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.533303 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.533325 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.533352 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.533372 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.636748 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.636845 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.636865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.636917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.636936 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.739941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.739992 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.740003 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.740019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.740030 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.843330 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.843379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.843388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.843407 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.843418 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.946355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.946433 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.946445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.946467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.946481 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:49Z","lastTransitionTime":"2025-12-03T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.986740 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/2.log" Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.989272 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26"} Dec 03 12:14:49 crc kubenswrapper[4666]: I1203 12:14:49.989764 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.048425 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.048459 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.048467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.048481 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.048491 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.088828 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.104293 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.133855 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.147881 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.150713 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.150758 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.150767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.150783 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.150796 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.161611 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.176973 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.192049 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.215582 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.239907 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.253712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.253767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.253777 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.253801 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.253815 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.255847 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.273031 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.292163 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.303933 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.320848 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.340431 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.356792 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.356863 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.356880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.356902 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.356916 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.367316 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.394832 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.409605 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:50Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.422983 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:50 crc kubenswrapper[4666]: E1203 12:14:50.423124 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.423303 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:50 crc kubenswrapper[4666]: E1203 12:14:50.423351 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.423457 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:50 crc kubenswrapper[4666]: E1203 12:14:50.423512 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.459690 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.459722 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.459732 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.459749 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.459759 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.562122 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.562595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.562685 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.562767 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.562834 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.665296 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.665346 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.665355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.665373 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.665386 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.768947 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.769337 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.769479 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.769618 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.769743 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.873681 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.873758 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.873778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.873812 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.873835 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.977116 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.977179 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.977196 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.977220 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:50 crc kubenswrapper[4666]: I1203 12:14:50.977235 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:50Z","lastTransitionTime":"2025-12-03T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.080981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.081061 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.081082 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.081187 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.081256 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.184617 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.184677 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.184694 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.184718 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.184730 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.288712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.288778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.288793 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.288815 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.288830 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.393248 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.393329 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.393350 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.393467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.393491 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.423046 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:51 crc kubenswrapper[4666]: E1203 12:14:51.423405 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.443720 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.464652 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12
-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.481060 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.496623 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.496976 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.497003 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.497014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.497032 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.497047 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.513197 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.534406 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.549236 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.568187 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.582635 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.598556 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.599950 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.600005 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.600018 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.600038 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.600053 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.614210 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.640860 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.655505 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.671931 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.692198 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.703713 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.703752 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.703764 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.703784 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.703800 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.707874 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.724137 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.738624 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:51Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.806896 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.806976 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.807019 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.807056 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.807081 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.910375 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.910449 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.910471 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.910504 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.910531 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:51Z","lastTransitionTime":"2025-12-03T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:51 crc kubenswrapper[4666]: I1203 12:14:51.999250 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/3.log" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.000343 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/2.log" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.004162 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" exitCode=1 Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.004242 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.004362 4666 scope.go:117] "RemoveContainer" containerID="798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.005219 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" Dec 03 12:14:52 crc kubenswrapper[4666]: E1203 12:14:52.005399 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.016273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.016327 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.016348 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.016376 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.016395 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.032699 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.055155 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.080436 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 
2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.107450 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.119264 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.119313 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.119327 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.119348 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.119362 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.122768 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.139665 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.154891 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.174030 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.192351 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.214094 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:51Z\\\",\\\"message\\\":\\\"ck:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1203 12:14:51.108959 6594 obj_retry.go:551] Creating *factory.egressNode crc took: 3.97498ms\\\\nI1203 12:14:51.108992 6594 factory.go:1336] Added *v1.Node event handler 7\\\\nI1203 12:14:51.109039 6594 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1203 12:14:51.109114 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1203 12:14:51.109135 6594 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.305286ms\\\\nI1203 12:14:51.109313 6594 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 12:14:51.109347 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}\\\\nI1203 12:14:51.109363 6594 services_controller.go:360] Finished syncing service router-internal-default on namespace openshift-ingress for network=default : 1.404219ms\\\\nI1203 12:14:51.109426 6594 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 12:14:51.109477 6594 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:51.109508 6594 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:51.109576 6594 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.221945 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.222004 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.222014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.222035 4666 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.222046 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.227886 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.242433 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.258104 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.271714 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.287045 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.300701 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.315115 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.324557 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.324606 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.324618 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.324636 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.324647 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.332091 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.422796 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.422889 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.422836 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:52 crc kubenswrapper[4666]: E1203 12:14:52.423000 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:52 crc kubenswrapper[4666]: E1203 12:14:52.423231 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:52 crc kubenswrapper[4666]: E1203 12:14:52.423383 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.427435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.427773 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.427933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.428102 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.428296 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.538415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.538534 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.538548 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.538573 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.538590 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.641766 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.642119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.642296 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.642390 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.642482 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.745866 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.745904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.745917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.745935 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.745945 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.848729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.848806 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.848828 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.848866 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.848893 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.953133 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.953185 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.953197 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.953214 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:52 crc kubenswrapper[4666]: I1203 12:14:52.953226 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:52Z","lastTransitionTime":"2025-12-03T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.009692 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/3.log" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.056790 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.056863 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.056881 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.056910 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.056930 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.160458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.160526 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.160556 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.160588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.160607 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.264189 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.264248 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.264265 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.264292 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.264314 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.367555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.367612 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.367630 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.367656 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.367677 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.422958 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:53 crc kubenswrapper[4666]: E1203 12:14:53.423203 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.471158 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.471216 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.471233 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.471257 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.471274 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.575623 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.575687 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.575709 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.575732 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.575748 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.679249 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.679302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.679314 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.679334 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.679344 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.782435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.782494 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.782507 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.782530 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.782546 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.885579 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.885638 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.885660 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.885691 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.885704 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.988348 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.988400 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.988415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.988437 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:53 crc kubenswrapper[4666]: I1203 12:14:53.988452 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:53Z","lastTransitionTime":"2025-12-03T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.091332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.091421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.091441 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.091466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.091484 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.194746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.194806 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.194824 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.194854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.194873 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.297336 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.297379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.297389 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.297403 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.297413 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.401007 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.401149 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.401185 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.401217 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.401237 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.423326 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.423432 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.423494 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:54 crc kubenswrapper[4666]: E1203 12:14:54.423576 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:54 crc kubenswrapper[4666]: E1203 12:14:54.423688 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:54 crc kubenswrapper[4666]: E1203 12:14:54.423953 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.505046 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.505145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.505166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.505191 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.505209 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.608001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.608140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.608161 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.608195 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.608213 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.711478 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.711558 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.711571 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.711594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.711609 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.815177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.815214 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.815224 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.815242 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.815252 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.918640 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.918721 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.918748 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.918781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:54 crc kubenswrapper[4666]: I1203 12:14:54.918801 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:54Z","lastTransitionTime":"2025-12-03T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.021463 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.021983 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.022219 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.022425 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.022659 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.127273 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.127332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.127346 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.127369 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.127385 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.231871 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.231918 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.231929 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.231966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.231980 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.335439 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.335490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.335503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.335522 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.335559 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.423491 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.423670 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.423679 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.423861 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.423934 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.423913457 +0000 UTC m=+148.268874508 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.423963 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.423996 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.424023 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424049 4666 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424134 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424151 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424162 4666 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424176 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.424150554 +0000 UTC m=+148.269111645 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424209 4666 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424212 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.424196575 +0000 UTC m=+148.269157666 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424240 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.424224786 +0000 UTC m=+148.269185877 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424265 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424275 4666 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424282 4666 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:55 crc kubenswrapper[4666]: E1203 12:14:55.424303 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.424295488 +0000 UTC m=+148.269256539 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.438281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.438368 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.438423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.438457 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.438523 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.541389 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.541424 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.541435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.541452 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.541463 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.644375 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.644408 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.644421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.644436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.644447 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.747960 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.748077 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.748132 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.748166 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.748193 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.851035 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.851133 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.851148 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.851688 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.851731 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.955133 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.955499 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.955622 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.955752 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:55 crc kubenswrapper[4666]: I1203 12:14:55.955852 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:55Z","lastTransitionTime":"2025-12-03T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.058844 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.059210 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.059303 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.059371 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.059435 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.162491 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.162532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.162544 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.162564 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.162577 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.237245 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.237313 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.237331 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.237357 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.237375 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.261217 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:56Z is after 2025-08-24T17:21:41Z"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.267939 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.267998 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
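
Every status patch is being rejected for the same reason: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-03. A minimal Go sketch to confirm the validity window from the node itself (assumes the port is reachable locally; verification is deliberately skipped because the goal is inspection, not trust):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the error above and read the
	// certificate it serves without verifying it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:   %v (now %s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}

With the clock and certificate from the error above, this would print notAfter 2025-08-24T17:21:41Z and expired true, matching the x509 failure kubelet reports on every retry.
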
event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.268010 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.268033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.268047 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.291991 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.299800 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.299850 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.299870 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.299898 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.299916 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.324825 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.334261 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.334332 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.334356 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.334391 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.334420 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.359060 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.365489 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.365555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
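
The rejected patch itself is embedded in the 12:14:56.261217 err string above as a Go-quoted string inside another Go-quoted string (hence the \\\" runs). A minimal Go sketch that recovers and pretty-prints it (assumes the escaped text between the outer \" ... \" has been copied into a local file, here hypothetically named patch.txt):

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"os"
	"strconv"
	"strings"
)

func main() {
	raw, err := os.ReadFile("patch.txt") // hypothetical file holding the escaped payload
	if err != nil {
		panic(err)
	}
	s := strings.TrimSpace(string(raw))
	// The journal shows the JSON quoted twice, so unquote twice; stop early
	// if the text is already fully unescaped.
	for i := 0; i < 2; i++ {
		u, err := strconv.Unquote(`"` + s + `"`)
		if err != nil {
			break
		}
		s = u
	}
	var out bytes.Buffer
	if err := json.Indent(&out, []byte(s), "", "  "); err != nil {
		panic(err)
	}
	fmt.Println(out.String())
}

Pretty-printed, the payload is just the node's status update: the four conditions, allocatable/capacity, the image list, and nodeInfo, exactly as quoted above.
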
event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.365572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.365602 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.365623 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.383841 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.384082 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.386532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.386579 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.386588 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.386606 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.386616 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.422625 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.422666 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.422716 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.422766 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.423139 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:56 crc kubenswrapper[4666]: E1203 12:14:56.423277 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.442965 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.489278 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.489333 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.489348 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.489371 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.489386 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.591908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.591949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.591958 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.591974 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.591983 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.694852 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.694912 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.694924 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.694945 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.694958 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.798377 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.798466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.798475 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.798490 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.798500 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.901428 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.901472 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.901484 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.901500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:56 crc kubenswrapper[4666]: I1203 12:14:56.901511 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:56Z","lastTransitionTime":"2025-12-03T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.004919 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.004988 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.005012 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.005046 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.005070 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.107961 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.108006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.108016 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.108033 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.108045 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.211015 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.211064 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.211075 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.211108 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.211119 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.313943 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.314003 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.314014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.314030 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.314041 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.417132 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.417182 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.417193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.417212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.417223 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.423581 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:57 crc kubenswrapper[4666]: E1203 12:14:57.423709 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.520066 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.520144 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.520161 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.520181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.520193 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.623448 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.623491 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.623502 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.623518 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.623528 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.726771 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.726804 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.726815 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.726831 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.726840 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.829583 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.829641 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.829655 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.829677 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.829688 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.932537 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.932626 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.932652 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.932693 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:57 crc kubenswrapper[4666]: I1203 12:14:57.932716 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:57Z","lastTransitionTime":"2025-12-03T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.036109 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.036184 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.036205 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.036233 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.036249 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.139000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.139366 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.139474 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.139557 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.139620 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.241770 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.242040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.242133 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.242205 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.242272 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.346134 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.346205 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.346219 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.346245 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.346261 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.423239 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.423261 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:14:58 crc kubenswrapper[4666]: E1203 12:14:58.423456 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.423261 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:14:58 crc kubenswrapper[4666]: E1203 12:14:58.423602 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:14:58 crc kubenswrapper[4666]: E1203 12:14:58.423712 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.450071 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.450508 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.450686 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.450842 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.450981 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.553120 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.553175 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.553187 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.553210 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.553229 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.655903 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.656275 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.656353 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.656424 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.656542 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.759844 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.760139 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.760197 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.760231 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.760241 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.862304 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.862343 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.862351 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.862366 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.862377 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.964781 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.964833 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.964842 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.964861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:58 crc kubenswrapper[4666]: I1203 12:14:58.964871 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:58Z","lastTransitionTime":"2025-12-03T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.068534 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.068614 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.068626 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.068643 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.068661 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.172548 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.172591 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.172600 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.172615 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.172626 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.275580 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.275658 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.275670 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.275689 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.275728 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.378904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.378955 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.378969 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.378992 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.379010 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.422975 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:14:59 crc kubenswrapper[4666]: E1203 12:14:59.423186 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.481937 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.481990 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.482001 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.482024 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.482036 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.584671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.584706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.584716 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.584731 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.584739 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.687467 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.687507 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.687516 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.687532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.687541 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.789881 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.789929 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.789941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.789959 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.789972 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.892944 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.893271 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.893337 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.893415 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.893484 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.996770 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.996843 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.996861 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.996889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:14:59 crc kubenswrapper[4666]: I1203 12:14:59.996908 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:14:59Z","lastTransitionTime":"2025-12-03T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.099691 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.099745 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.099761 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.099783 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.099796 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.202901 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.202933 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.202941 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.202957 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.202966 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.306308 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.306401 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.306438 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.306474 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.306503 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.409063 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.409125 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.409145 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.409192 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.409204 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.423548 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.423548 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:00 crc kubenswrapper[4666]: E1203 12:15:00.423718 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.423576 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:00 crc kubenswrapper[4666]: E1203 12:15:00.423963 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:00 crc kubenswrapper[4666]: E1203 12:15:00.423992 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.512219 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.512262 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.512274 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.512291 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.512303 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.614789 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.614823 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.614834 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.614880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.614890 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.717624 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.717687 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.717699 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.717723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.717737 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.820539 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.820595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.820606 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.820625 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.820637 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.923442 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.923492 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.923504 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.923526 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:00 crc kubenswrapper[4666]: I1203 12:15:00.923544 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:00Z","lastTransitionTime":"2025-12-03T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.026830 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.026870 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.026880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.026897 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.026907 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.129301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.129352 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.129368 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.129394 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.129411 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.231854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.231894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.231905 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.231924 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.231935 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.334235 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.334594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.334685 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.334802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.334920 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.422725 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:01 crc kubenswrapper[4666]: E1203 12:15:01.422938 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.435922 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.437032 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.437081 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.437114 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.437138 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.437156 4666 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.452391 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.16
8.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.465479 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.483725 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\
",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.497752 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.516960 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 
2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.532065 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.540633 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.540683 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.540699 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.540720 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.540734 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.545040 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.569695 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.587231 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.602561 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.616559 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.638929 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798e840cbbf5bc39f331cce1d3d2fd5b87b14b01b4fbb01a448827ad3e32b268\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:19Z\\\",\\\"message\\\":\\\"r:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708634 6300 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 12:14:17.708575 6300 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:17.708704 6300 factory.go:656] Stopping watch factory\\\\nI1203 12:14:17.708721 6300 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:17.708750 6300 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:17.708824 6300 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:51Z\\\",\\\"message\\\":\\\"ck:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1203 12:14:51.108959 6594 obj_retry.go:551] Creating *factory.egressNode crc took: 3.97498ms\\\\nI1203 12:14:51.108992 6594 factory.go:1336] Added *v1.Node event handler 7\\\\nI1203 12:14:51.109039 6594 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1203 12:14:51.109114 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1203 12:14:51.109135 6594 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.305286ms\\\\nI1203 12:14:51.109313 6594 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 12:14:51.109347 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}\\\\nI1203 12:14:51.109363 6594 services_controller.go:360] Finished syncing service router-internal-default on namespace openshift-ingress for network=default : 1.404219ms\\\\nI1203 12:14:51.109426 6594 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 12:14:51.109477 6594 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:51.109508 6594 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:51.109576 6594 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.643618 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.643709 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.643723 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.643742 4666 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.643756 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.650269 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.673482 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24f47a55-eb74-459b-af96-79356b773f88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bed75ae0c029eb4a1b2c56a1efed6a8867eba9df99a868001e5beef026c6874a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52359e94945637cfe5a1b40c96bb5adfe7d99b4f2cdcc22c164f1b51b7299e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://655e3e64dcbdd148aced91acaa10b627bd23eda51dced477f3ee7b2cc74cc8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee
7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a333daff0ee07b57450db848265866622a69262e0c671b52481b5d866da63d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7554583c33429d01b442e078c6d75f9234971717fc5f99ac1d238d41ab3e6cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.690248 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.704687 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.719336 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.735952 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.746313 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.746376 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.746388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.746409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.746424 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.849838 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.849896 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.849908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.849931 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.849943 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.952582 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.952636 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.952654 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.952677 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:01 crc kubenswrapper[4666]: I1203 12:15:01.952692 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:01Z","lastTransitionTime":"2025-12-03T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.056143 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.056401 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.056411 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.056427 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.056436 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.159010 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.159051 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.159060 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.159078 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.159104 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.261559 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.261602 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.261614 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.261632 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.261644 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.364186 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.364231 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.364240 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.364258 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.364267 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.422852 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.422911 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:02 crc kubenswrapper[4666]: E1203 12:15:02.423017 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.422944 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:02 crc kubenswrapper[4666]: E1203 12:15:02.423157 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:02 crc kubenswrapper[4666]: E1203 12:15:02.423274 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.467131 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.467173 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.467182 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.467212 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.467221 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.569772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.569813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.569825 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.569845 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.569858 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.673465 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.673506 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.673515 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.673532 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.673541 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.776404 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.776677 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.776743 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.776858 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.776933 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.879589 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.879650 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.879666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.879694 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.879714 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.982074 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.982349 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.982458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.982536 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:02 crc kubenswrapper[4666]: I1203 12:15:02.982640 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:02Z","lastTransitionTime":"2025-12-03T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.086297 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.086340 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.086355 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.086375 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.086390 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.189653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.189731 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.189745 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.189766 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.189779 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.293636 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.293706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.293718 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.293737 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.293752 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.397569 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.397644 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.397666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.397703 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.397726 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.423241 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:03 crc kubenswrapper[4666]: E1203 12:15:03.423447 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.501503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.501556 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.501571 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.501594 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.501607 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.604510 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.604541 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.604551 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.604566 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.604576 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.707371 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.707470 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.707500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.707531 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.707553 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.810671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.810716 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.810727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.810742 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.810753 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.913847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.913880 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.913888 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.913906 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:03 crc kubenswrapper[4666]: I1203 12:15:03.913915 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:03Z","lastTransitionTime":"2025-12-03T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.017673 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.017734 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.017747 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.017769 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.017788 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.121120 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.121167 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.121177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.121195 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.121207 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.224119 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.224156 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.224165 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.224181 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.224191 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.326462 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.326522 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.326533 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.326551 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.326564 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.423462 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.423541 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.423577 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:04 crc kubenswrapper[4666]: E1203 12:15:04.423629 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:04 crc kubenswrapper[4666]: E1203 12:15:04.423684 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:04 crc kubenswrapper[4666]: E1203 12:15:04.423754 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.430142 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.430189 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.430200 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.430223 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.430238 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.532784 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.533073 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.533160 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.533228 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.533326 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.636358 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.636429 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.636446 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.636473 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.636489 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.738924 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.738983 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.739000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.739025 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.739043 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.843018 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.843157 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.843201 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.843237 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.843262 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.945965 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.946006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.946014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.946032 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:04 crc kubenswrapper[4666]: I1203 12:15:04.946045 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:04Z","lastTransitionTime":"2025-12-03T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.049541 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.049592 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.049605 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.049623 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.049638 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.153000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.153634 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.153812 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.154014 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.154263 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.257197 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.257304 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.257330 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.257358 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.257375 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.361316 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.361388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.361406 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.361436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.361466 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.423298 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:05 crc kubenswrapper[4666]: E1203 12:15:05.423470 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.464530 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.464601 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.464615 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.464641 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.464659 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.568339 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.568388 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.568399 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.568419 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.568432 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.671472 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.671552 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.671563 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.671581 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.671946 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.775576 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.775673 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.775684 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.775702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.775719 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.878894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.878957 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.878970 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.878990 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.879005 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.982447 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.982512 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.982523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.982540 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:05 crc kubenswrapper[4666]: I1203 12:15:05.982552 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:05Z","lastTransitionTime":"2025-12-03T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.085428 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.085468 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.085477 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.085493 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.085504 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.187718 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.187775 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.187793 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.187814 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.187828 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.291022 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.291144 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.291171 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.291198 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.291259 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.394193 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.394230 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.394240 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.394256 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.394267 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.422662 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.422765 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.422793 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.422909 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.423035 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.423624 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.424581 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.424868 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.444530 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.468479 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.477786 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.477832 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.477845 4666 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.477863 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.477879 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.488789 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.498371 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.503883 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.503991 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.504024 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.504062 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.504126 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.514527 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.522946 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.529420 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.529500 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.529530 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.529554 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.529566 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.534932 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.543713 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.547533 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.547592 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.547607 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.547631 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.547647 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.559429 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:51Z\\\",\\\"message\\\":\\\"ck:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:51.108959 6594 obj_retry.go:551] Creating *factory.egressNode crc took: 3.97498ms\\\\nI1203 12:14:51.108992 6594 factory.go:1336] Added *v1.Node event handler 7\\\\nI1203 12:14:51.109039 6594 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1203 12:14:51.109114 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1203 12:14:51.109135 6594 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.305286ms\\\\nI1203 12:14:51.109313 6594 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 12:14:51.109347 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}\\\\nI1203 12:14:51.109363 6594 services_controller.go:360] Finished syncing service router-internal-default on namespace openshift-ingress for network=default : 1.404219ms\\\\nI1203 12:14:51.109426 6594 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 12:14:51.109477 6594 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:51.109508 6594 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:51.109576 6594 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.563357 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.570367 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.570429 4666 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.570445 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.570462 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.570474 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.573562 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.588813 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T12:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"965dbf1a-276c-4547-879d-6d43a85ca63c\\\",\\\"systemUUID\\\":\\\"381ea1db-1c63-4f6e-af2b-f374cfb9263c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: E1203 12:15:06.588951 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.591820 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.591991 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.592015 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.592208 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.592230 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.605943 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24f47a55-eb74-459b-af96-79356b773f88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bed75ae0c029eb4a1b2c56a1efed6a8867eba9df99a868001e5beef026c6874a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52359e94945637cfe5a1b40c96bb5adfe7d99b4f2cdcc22c164f1b51b7299e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://655e3e64dcbdd148aced91acaa10b627bd23eda51dced477f3ee7b2cc74cc8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a333daff0ee07b57450db848265866622a69262e0c671b52481b5d866da63d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7554583c33429d01b442e078c6d75f9234971717fc5f99ac1d238d41ab3e6cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.621133 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.635935 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.649199 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.663436 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.676331 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.692622 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.695200 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.695266 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.695281 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.695302 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.695316 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.707206 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.721150 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.734891 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.752916 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 
2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.772046 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:06Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.798598 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.798719 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.798733 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.798752 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.798765 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.902012 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.902362 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.902416 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.902444 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:06 crc kubenswrapper[4666]: I1203 12:15:06.902465 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:06Z","lastTransitionTime":"2025-12-03T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.005568 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.005621 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.005631 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.005653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.005668 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.109265 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.109325 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.109344 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.109370 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.109385 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.212126 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.212564 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.212654 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.212757 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.212854 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.315522 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.315597 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.315666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.315707 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.315732 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.418269 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.418313 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.418323 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.418337 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.418347 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.422903 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:07 crc kubenswrapper[4666]: E1203 12:15:07.423141 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.522436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.522513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.522528 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.522569 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.522589 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.626503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.626583 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.626602 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.626631 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.626651 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.729784 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.729873 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.729889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.729914 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.729928 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.833697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.833746 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.833757 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.833774 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.833788 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.936811 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.936874 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.936889 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.936910 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:07 crc kubenswrapper[4666]: I1203 12:15:07.936924 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:07Z","lastTransitionTime":"2025-12-03T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.040901 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.040966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.040977 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.041003 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.041018 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.143646 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.143689 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.143697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.143712 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.143723 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.246493 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.246550 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.246559 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.246575 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.246586 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.350370 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.350435 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.350446 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.350466 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.350479 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.422595 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.422744 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:08 crc kubenswrapper[4666]: E1203 12:15:08.422916 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.422773 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:08 crc kubenswrapper[4666]: E1203 12:15:08.423232 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:08 crc kubenswrapper[4666]: E1203 12:15:08.423257 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.453534 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.453585 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.453595 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.453617 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.453632 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.556711 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.556762 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.556772 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.556791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.556802 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.659513 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.659560 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.659569 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.659586 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.659596 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.762802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.762855 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.762867 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.762887 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.762898 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.865908 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.865961 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.865972 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.865990 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.866003 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.968604 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.968653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.968662 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.968677 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:08 crc kubenswrapper[4666]: I1203 12:15:08.968687 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:08Z","lastTransitionTime":"2025-12-03T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.070706 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.070762 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.070777 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.070800 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.070813 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.173210 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.173256 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.173266 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.173285 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.173294 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.183134 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:09 crc kubenswrapper[4666]: E1203 12:15:09.183300 4666 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:09 crc kubenswrapper[4666]: E1203 12:15:09.183370 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs podName:1889fa0a-c57e-4b03-884b-f096236b084b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:13.183348319 +0000 UTC m=+162.028309370 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs") pod "network-metrics-daemon-s4f78" (UID: "1889fa0a-c57e-4b03-884b-f096236b084b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.276410 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.276508 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.276523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.276544 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.276557 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.380031 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.380079 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.380106 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.380126 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.380136 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.423154 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:09 crc kubenswrapper[4666]: E1203 12:15:09.423341 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.483328 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.483389 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.483406 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.483429 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.483442 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.586652 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.586698 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.586708 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.586728 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.586740 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.689697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.689743 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.689752 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.689768 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.689778 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.792269 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.792309 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.792318 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.792335 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.792344 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.894865 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.894907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.894920 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.894940 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.894952 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.996989 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.997025 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.997034 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.997051 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:09 crc kubenswrapper[4666]: I1203 12:15:09.997061 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:09Z","lastTransitionTime":"2025-12-03T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.099615 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.099666 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.099678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.099697 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.099709 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.202177 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.202216 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.202225 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.202244 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.202255 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.305847 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.305913 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.305925 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.305943 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.305955 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.409330 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.409421 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.409446 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.409484 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.409509 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.423647 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.423699 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.423652 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:10 crc kubenswrapper[4666]: E1203 12:15:10.423792 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:10 crc kubenswrapper[4666]: E1203 12:15:10.424488 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:10 crc kubenswrapper[4666]: E1203 12:15:10.424000 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.513555 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.513616 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.513633 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.513660 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.513678 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.617194 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.617276 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.617296 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.617325 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.617343 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.720905 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.720966 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.720988 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.721025 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.721051 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.824396 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.824458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.824476 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.824502 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.824518 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.927822 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.927891 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.927917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.927962 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:10 crc kubenswrapper[4666]: I1203 12:15:10.927982 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:10Z","lastTransitionTime":"2025-12-03T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.031417 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.031479 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.031496 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.031523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.031542 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.165794 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.165864 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.165894 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.165926 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.165950 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.268977 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.269041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.269060 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.269121 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.269146 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.374364 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.374423 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.374436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.374459 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.374473 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.422762 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:11 crc kubenswrapper[4666]: E1203 12:15:11.422993 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.448270 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vzctp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99024354-6b69-4788-9f26-2f7fbef66e7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eb2cbd912fe442dfb1c87cb5785d7b3f0d28f7c3f9323e9e53c68261c8ec2ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctgq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vzctp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.470848 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782e76d3-8dbe-4c2e-952c-6a966e2c06a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1469fee7afe987ad91d03c190f59d8f40e33859364dc302399314d1e948c4218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj868\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q9g72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.477023 4666 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.477169 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.477190 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.477221 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.477241 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.493317 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccad3bb3-5e47-4dcc-a6e0-830378bde2ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6828e5dd28253843cfb01748260c4d01a3b4eb916e0a410ce11d5bcb32df0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7593597bf3c377fe7569bbb08e3c90c36b68a1494c49cb0f64f0428516c1296b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc6kt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cp56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.515578 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca6f4f8-d709-4b86-92d8-3d902ef06f31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a86f7b9def49328f598ffe57ece61b459baaa6271a763c80cea05c3e62ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ed824ddc5cbc4829943b3a8b0b18794fe72059f90422ca6b7a186eca9913e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5969483cd334f86f8109eefc81c81c705e9084d6d5194d0d61f15c563efc5ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.537267 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"015a4927-2b6d-4aeb-83d2-1a096cf1f343\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04b11387a822e591a23824e781bb9a948cd161dbcb42fe32db29ff8323620ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14a2a2e2c76e98c9eb7065d06ba1fedd3f4b2c5213c6baa10d4cd6ee4cf72390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3d387ba37daa2567d000b9093fd4f94ee0733cabceec4e9e1b69d32a8ff14b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29f90d2e29900208386fbe19ccd79df7fd7ad1d62c957e21a8097025ac4b3dbf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.555375 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97de03618477a5fe7d408a6b91704609a2d6b0f673a9a2b7b30e7b809a1df907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 
2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.579969 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.580042 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.580062 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.580118 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.580140 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.584085 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bea0ec2c-aed9-4ff3-9f36-48d3106926b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dad1b1da3ff370698c60c444594c21487a81566824e73f5f25f889bce929caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db77
08c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://965a800d9bbaea40f2521732847e38c561e05fe2bbf419f86bff63422c23ff47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b31132404c7bb6df9c8373b63279403d15f3eae4e7ff44999042152735d5587\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45997c6d050f2855f90b8efc670f5b658e4588741bf2a7558cd6c9a44e630baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\"
:\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c01df28387f7deb6c67dfaa333793ad2288cd31269eb06cce8206d8fa1ff075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f749fcfc2a1e1cefda4cb866df8ecf54b932478c9f5ea8cde1600ae140bf517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbad98d2a3a4258ad218d9d7f77cbb541a196551f9f4f4d12380259ca9d10520\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-12-03T12:13:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd6kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p6hxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.606650 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c0bee8edf39ed0ed2f4f149d247c313a6bec502392cbf59db6a99d8d279d5e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.626993 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b34b28d0bf387c35c3b2e5c82452962ce7fee5fb07d2a98229b17c192a33894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecd5bec6db09b9c434b2aabd42e5639410eac0025f2bb6ef3b13b3952575aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.649786 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.665648 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.683135 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.683213 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.683225 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.683245 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.683257 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.690717 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wbdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba134276-4c96-4ba6-b18f-276b312a7355\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:39Z\\\",\\\"message\\\":\\\"2025-12-03T12:13:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1\\\\n2025-12-03T12:13:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d273e786-890e-463b-9fa1-29b2469784b1 to /host/opt/cni/bin/\\\\n2025-12-03T12:13:54Z [verbose] multus-daemon started\\\\n2025-12-03T12:13:54Z [verbose] Readiness Indicator file check\\\\n2025-12-03T12:14:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b67km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wbdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.717671 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T12:14:51Z\\\",\\\"message\\\":\\\"ck:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 12:14:51.108959 6594 obj_retry.go:551] Creating *factory.egressNode crc took: 3.97498ms\\\\nI1203 12:14:51.108992 6594 factory.go:1336] Added *v1.Node event handler 7\\\\nI1203 12:14:51.109039 6594 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1203 12:14:51.109114 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1203 12:14:51.109135 6594 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.305286ms\\\\nI1203 12:14:51.109313 6594 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 12:14:51.109347 6594 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}\\\\nI1203 12:14:51.109363 6594 services_controller.go:360] Finished syncing service router-internal-default on namespace openshift-ingress for network=default : 1.404219ms\\\\nI1203 12:14:51.109426 6594 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 12:14:51.109477 6594 ovnkube.go:599] Stopped ovnkube\\\\nI1203 12:14:51.109508 6594 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 12:14:51.109576 6594 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whx68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mh5x5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.732639 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8074a16-93db-44c2-af96-80e45c8f9d22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db16d7efd060579c2c112c7ef17479c767cf7eb656936d78e8ee13a5213e0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9
e1f72a65a78c0ad459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4bfb6a4082e0ab4941d3e6587b8b5a10ca1c79b39d9f9e1f72a65a78c0ad459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.764430 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24f47a55-eb74-459b-af96-79356b773f88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bed75ae0c029eb4a1b2c56a1efed6a8867eba9df99a868001e5beef026c6874a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52359e94945637cfe5a1b40c96bb5adfe7d99b4f2cdcc22c164f1b51b7299e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://655e3e64dcbdd148aced91acaa10b627bd23eda51dced477f3ee7b2cc74cc8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a333daff0ee07b57450db848265866622a69262e0c671b52481b5d866da63d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7554583c33429d01b442e078c6d75f9234971717fc5f99ac1d238d41ab3e6cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3561ff4d02c8bb6e0927b07751027f9851ceb43355e70558a3a49e1d2615cde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7448baf8b3f47d144b42fcd355397a1074eb85b52e9b6a8538cf1137b9ac9d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c822c15b545caf06baf6dd8e3422011646ced541ba5a7c809c09c91e117e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.783635 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65d87524-d1ce-4e3a-88fc-229830eca10d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T12:13:50Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 12:13:45.208952 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 12:13:45.209719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913206209/tls.crt::/tmp/serving-cert-1913206209/tls.key\\\\\\\"\\\\nI1203 12:13:50.480470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 12:13:50.484528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 12:13:50.484565 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 12:13:50.484596 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 12:13:50.484604 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 12:13:50.489444 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 12:13:50.489460 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 12:13:50.489488 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489495 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 12:13:50.489503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 12:13:50.489506 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 12:13:50.489513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 12:13:50.489520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 12:13:50.491877 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T12:13:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T12:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.786624 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.786671 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.786683 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.786707 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.786722 4666 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.799468 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.814537 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4lgm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49c29a00-1d2c-4222-9f43-e125c87085c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34cf568712c25c13d325f14909f28b1d53ae1782057ea415b79da4608c49746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj7g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:13:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4lgm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.830514 4666 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s4f78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1889fa0a-c57e-4b03-884b-f096236b084b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T12:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6mxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T12:14:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s4f78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T12:15:11Z is after 2025-08-24T17:21:41Z" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.890517 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.890583 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.890602 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.890621 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.890650 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.994855 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.994914 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.994931 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.994956 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:11 crc kubenswrapper[4666]: I1203 12:15:11.994974 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:11Z","lastTransitionTime":"2025-12-03T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.099476 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.099571 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.099626 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.099667 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.099697 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.203174 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.203241 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.203264 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.203298 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.203323 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.307813 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.307883 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.307917 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.307950 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.307975 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.411045 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.411129 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.411147 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.411172 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.411192 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.423279 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.423382 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.423315 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:12 crc kubenswrapper[4666]: E1203 12:15:12.423510 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:12 crc kubenswrapper[4666]: E1203 12:15:12.423599 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:12 crc kubenswrapper[4666]: E1203 12:15:12.423802 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.514613 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.514682 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.514702 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.514727 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.514745 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.617914 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.617983 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.617995 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.618029 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.618041 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.720904 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.720940 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.720949 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.720981 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.720992 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.824932 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.824994 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.825011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.825040 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.825058 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.928017 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.928081 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.928149 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.928186 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:12 crc kubenswrapper[4666]: I1203 12:15:12.928226 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:12Z","lastTransitionTime":"2025-12-03T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.031830 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.031877 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.031886 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.031905 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.031917 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.141628 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.141678 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.141686 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.141705 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.141714 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.244073 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.244130 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.244140 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.244157 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.244168 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.346309 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.346367 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.346379 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.346401 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.346414 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.423471 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:13 crc kubenswrapper[4666]: E1203 12:15:13.423661 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.449205 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.449265 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.449279 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.449301 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.449314 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.551932 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.551987 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.551999 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.552017 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.552029 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.655258 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.655322 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.655339 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.655367 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.655383 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.757907 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.757973 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.757986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.758011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.758027 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.860458 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.860495 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.860503 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.860518 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.860527 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.963390 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.963436 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.963464 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.963483 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:13 crc kubenswrapper[4666]: I1203 12:15:13.963498 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:13Z","lastTransitionTime":"2025-12-03T12:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.066426 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.066485 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.066505 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.066525 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.066539 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.169820 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.169877 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.169890 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.169909 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.169919 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.273218 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.273251 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.273260 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.273276 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.273285 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.375517 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.375581 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.375591 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.375609 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.375620 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.422916 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.423003 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.423045 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:14 crc kubenswrapper[4666]: E1203 12:15:14.423082 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:14 crc kubenswrapper[4666]: E1203 12:15:14.423441 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:14 crc kubenswrapper[4666]: E1203 12:15:14.423514 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.478744 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.478797 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.478816 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.478839 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.478853 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.581992 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.582041 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.582051 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.582076 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.582103 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.686265 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.686314 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.686340 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.686364 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.686380 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.790363 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.790419 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.790431 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.790451 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.790464 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.893523 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.893572 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.893583 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.893602 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.893615 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.996220 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.996258 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.996268 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.996286 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:14 crc kubenswrapper[4666]: I1203 12:15:14.996297 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:14Z","lastTransitionTime":"2025-12-03T12:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.099063 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.099157 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.099167 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.099189 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.099201 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.201732 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.201778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.201791 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.201808 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.201820 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.304914 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.304977 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.304986 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.305006 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.305018 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.408483 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.408536 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.408546 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.408564 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.408576 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.423246 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:15 crc kubenswrapper[4666]: E1203 12:15:15.423455 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.512854 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.512940 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.512964 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.513000 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.513025 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.617538 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.617610 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.617627 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.617653 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.617672 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.720866 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.720938 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.720956 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.720983 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.721002 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.824542 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.824584 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.824599 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.824616 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.824629 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.927729 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.927778 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.927802 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.927824 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:15 crc kubenswrapper[4666]: I1203 12:15:15.927840 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:15Z","lastTransitionTime":"2025-12-03T12:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.031316 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.031371 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.031387 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.031409 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.031426 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.134577 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.134637 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.134650 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.134673 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.134689 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.237185 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.237239 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.237250 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.237275 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.237288 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.340510 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.340552 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.340560 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.340585 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.340595 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.423597 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.423692 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.423692 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:16 crc kubenswrapper[4666]: E1203 12:15:16.423883 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:16 crc kubenswrapper[4666]: E1203 12:15:16.424055 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:16 crc kubenswrapper[4666]: E1203 12:15:16.424175 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.444139 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.444204 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.444271 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.444333 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.444354 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.547952 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.548011 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.548027 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.548049 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.548067 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.616896 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.616942 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.616951 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.616969 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.616982 4666 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T12:15:16Z","lastTransitionTime":"2025-12-03T12:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.703214 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"] Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.703864 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.706516 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.706681 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.706835 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.707996 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.748649 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.748627116 podStartE2EDuration="1m22.748627116s" podCreationTimestamp="2025-12-03 12:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.727596177 +0000 UTC m=+105.572557248" watchObservedRunningTime="2025-12-03 12:15:16.748627116 +0000 UTC m=+105.593588167" Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.748948 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.748942305 podStartE2EDuration="52.748942305s" podCreationTimestamp="2025-12-03 12:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.748877743 +0000 UTC m=+105.593838814" watchObservedRunningTime="2025-12-03 12:15:16.748942305 +0000 UTC m=+105.593903356" 
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.798346 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p6hxn" podStartSLOduration=86.798321574 podStartE2EDuration="1m26.798321574s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.783730442 +0000 UTC m=+105.628691503" watchObservedRunningTime="2025-12-03 12:15:16.798321574 +0000 UTC m=+105.643282625"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.864683 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wbdks" podStartSLOduration=86.864663951 podStartE2EDuration="1m26.864663951s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.864415394 +0000 UTC m=+105.709376445" watchObservedRunningTime="2025-12-03 12:15:16.864663951 +0000 UTC m=+105.709625002"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.868351 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb8c70e6-a498-4537-951f-a9ca568788a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.868627 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c70e6-a498-4537-951f-a9ca568788a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.868789 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c70e6-a498-4537-951f-a9ca568788a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.868908 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.869198 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.933402 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=20.933378023 podStartE2EDuration="20.933378023s" podCreationTimestamp="2025-12-03 12:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.933231729 +0000 UTC m=+105.778192790" watchObservedRunningTime="2025-12-03 12:15:16.933378023 +0000 UTC m=+105.778339094"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.933774 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=28.933767984 podStartE2EDuration="28.933767984s" podCreationTimestamp="2025-12-03 12:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.897960718 +0000 UTC m=+105.742921789" watchObservedRunningTime="2025-12-03 12:15:16.933767984 +0000 UTC m=+105.778729035"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.951077 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.95105891 podStartE2EDuration="1m25.95105891s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.950830153 +0000 UTC m=+105.795791234" watchObservedRunningTime="2025-12-03 12:15:16.95105891 +0000 UTC m=+105.796019961"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970639 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970701 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb8c70e6-a498-4537-951f-a9ca568788a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970733 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c70e6-a498-4537-951f-a9ca568788a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970805 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c70e6-a498-4537-951f-a9ca568788a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970829 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970901 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.970962 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bb8c70e6-a498-4537-951f-a9ca568788a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.973369 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c70e6-a498-4537-951f-a9ca568788a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.976617 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c70e6-a498-4537-951f-a9ca568788a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.981894 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4lgm9" podStartSLOduration=86.981879178 podStartE2EDuration="1m26.981879178s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:16.980287134 +0000 UTC m=+105.825248185" watchObservedRunningTime="2025-12-03 12:15:16.981879178 +0000 UTC m=+105.826840229"
Dec 03 12:15:16 crc kubenswrapper[4666]: I1203 12:15:16.993677 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb8c70e6-a498-4537-951f-a9ca568788a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97bx6\" (UID: \"bb8c70e6-a498-4537-951f-a9ca568788a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.008841 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vzctp" podStartSLOduration=87.00881777 podStartE2EDuration="1m27.00881777s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:17.007937636 +0000 UTC m=+105.852898687" watchObservedRunningTime="2025-12-03 12:15:17.00881777 +0000 UTC m=+105.853778821"
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.018260 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6"
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.020973 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podStartSLOduration=87.020947274 podStartE2EDuration="1m27.020947274s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:17.020282536 +0000 UTC m=+105.865243607" watchObservedRunningTime="2025-12-03 12:15:17.020947274 +0000 UTC m=+105.865908315"
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.049514 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cp56j" podStartSLOduration=86.04949247 podStartE2EDuration="1m26.04949247s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:17.046757035 +0000 UTC m=+105.891718106" watchObservedRunningTime="2025-12-03 12:15:17.04949247 +0000 UTC m=+105.894453521"
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.100246 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6" event={"ID":"bb8c70e6-a498-4537-951f-a9ca568788a5","Type":"ContainerStarted","Data":"57c365cbccae117ae88ec1b87170676449c48f8a68366b1a35d7a08ca038234f"}
Dec 03 12:15:17 crc kubenswrapper[4666]: I1203 12:15:17.423549 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:17 crc kubenswrapper[4666]: E1203 12:15:17.423719 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.104551 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6" event={"ID":"bb8c70e6-a498-4537-951f-a9ca568788a5","Type":"ContainerStarted","Data":"372800167f99dbf52dd2d55ab9e39f003070cf8d3c3d978bdeaaafb1f96d950b"}
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.120343 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97bx6" podStartSLOduration=88.120321224 podStartE2EDuration="1m28.120321224s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:18.120076677 +0000 UTC m=+106.965037758" watchObservedRunningTime="2025-12-03 12:15:18.120321224 +0000 UTC m=+106.965282305"
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.422902 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.423009 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.423082 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:15:18 crc kubenswrapper[4666]: E1203 12:15:18.423247 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 12:15:18 crc kubenswrapper[4666]: E1203 12:15:18.423884 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b"
Dec 03 12:15:18 crc kubenswrapper[4666]: E1203 12:15:18.423963 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 12:15:18 crc kubenswrapper[4666]: I1203 12:15:18.424580 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26"
Dec 03 12:15:18 crc kubenswrapper[4666]: E1203 12:15:18.424870 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c"
Dec 03 12:15:19 crc kubenswrapper[4666]: I1203 12:15:19.423366 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:19 crc kubenswrapper[4666]: E1203 12:15:19.423540 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 12:15:20 crc kubenswrapper[4666]: I1203 12:15:20.422711 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78"
Dec 03 12:15:20 crc kubenswrapper[4666]: I1203 12:15:20.422762 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:20 crc kubenswrapper[4666]: I1203 12:15:20.422758 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:20 crc kubenswrapper[4666]: E1203 12:15:20.422883 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:20 crc kubenswrapper[4666]: E1203 12:15:20.423010 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:20 crc kubenswrapper[4666]: E1203 12:15:20.423119 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:21 crc kubenswrapper[4666]: I1203 12:15:21.423610 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:21 crc kubenswrapper[4666]: E1203 12:15:21.425428 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:22 crc kubenswrapper[4666]: I1203 12:15:22.423361 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:22 crc kubenswrapper[4666]: I1203 12:15:22.423424 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:22 crc kubenswrapper[4666]: I1203 12:15:22.423369 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:22 crc kubenswrapper[4666]: E1203 12:15:22.423621 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:22 crc kubenswrapper[4666]: E1203 12:15:22.423800 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:22 crc kubenswrapper[4666]: E1203 12:15:22.423920 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:23 crc kubenswrapper[4666]: I1203 12:15:23.423327 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:23 crc kubenswrapper[4666]: E1203 12:15:23.423623 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:24 crc kubenswrapper[4666]: I1203 12:15:24.423555 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:24 crc kubenswrapper[4666]: I1203 12:15:24.423614 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:24 crc kubenswrapper[4666]: E1203 12:15:24.423737 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:24 crc kubenswrapper[4666]: E1203 12:15:24.423881 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:24 crc kubenswrapper[4666]: I1203 12:15:24.424152 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:24 crc kubenswrapper[4666]: E1203 12:15:24.424255 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:25 crc kubenswrapper[4666]: I1203 12:15:25.423385 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:25 crc kubenswrapper[4666]: E1203 12:15:25.424127 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:26 crc kubenswrapper[4666]: I1203 12:15:26.422578 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:26 crc kubenswrapper[4666]: I1203 12:15:26.422711 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:26 crc kubenswrapper[4666]: E1203 12:15:26.422915 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:26 crc kubenswrapper[4666]: I1203 12:15:26.422942 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:26 crc kubenswrapper[4666]: E1203 12:15:26.423039 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:26 crc kubenswrapper[4666]: E1203 12:15:26.423112 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.137634 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/1.log" Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.138298 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/0.log" Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.138354 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba134276-4c96-4ba6-b18f-276b312a7355" containerID="8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd" exitCode=1 Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.138393 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerDied","Data":"8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd"} Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.138439 4666 scope.go:117] "RemoveContainer" containerID="7caf3f2e13d30840409f436ebbe861d8a890f6bac546c197e677f8d7e0349584" Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.139192 4666 scope.go:117] "RemoveContainer" containerID="8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd" Dec 03 12:15:27 crc kubenswrapper[4666]: E1203 12:15:27.139552 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wbdks_openshift-multus(ba134276-4c96-4ba6-b18f-276b312a7355)\"" pod="openshift-multus/multus-wbdks" podUID="ba134276-4c96-4ba6-b18f-276b312a7355" Dec 03 12:15:27 crc kubenswrapper[4666]: I1203 12:15:27.422980 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:27 crc kubenswrapper[4666]: E1203 12:15:27.423157 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:28 crc kubenswrapper[4666]: I1203 12:15:28.144601 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/1.log" Dec 03 12:15:28 crc kubenswrapper[4666]: I1203 12:15:28.422608 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:28 crc kubenswrapper[4666]: I1203 12:15:28.422717 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:28 crc kubenswrapper[4666]: I1203 12:15:28.422823 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:28 crc kubenswrapper[4666]: E1203 12:15:28.423403 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:28 crc kubenswrapper[4666]: E1203 12:15:28.423407 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:28 crc kubenswrapper[4666]: E1203 12:15:28.423717 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:29 crc kubenswrapper[4666]: I1203 12:15:29.423028 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:29 crc kubenswrapper[4666]: E1203 12:15:29.423255 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:30 crc kubenswrapper[4666]: I1203 12:15:30.423078 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:30 crc kubenswrapper[4666]: E1203 12:15:30.423276 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:30 crc kubenswrapper[4666]: I1203 12:15:30.423349 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:30 crc kubenswrapper[4666]: I1203 12:15:30.423349 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:30 crc kubenswrapper[4666]: E1203 12:15:30.423651 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:30 crc kubenswrapper[4666]: E1203 12:15:30.423886 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:30 crc kubenswrapper[4666]: I1203 12:15:30.424747 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" Dec 03 12:15:30 crc kubenswrapper[4666]: E1203 12:15:30.424937 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mh5x5_openshift-ovn-kubernetes(6fce11cd-ec4a-4e25-9483-21a8a45f332c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" Dec 03 12:15:31 crc kubenswrapper[4666]: E1203 12:15:31.417408 4666 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 12:15:31 crc kubenswrapper[4666]: I1203 12:15:31.423023 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:31 crc kubenswrapper[4666]: E1203 12:15:31.424081 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:31 crc kubenswrapper[4666]: E1203 12:15:31.566080 4666 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:15:32 crc kubenswrapper[4666]: I1203 12:15:32.423320 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:32 crc kubenswrapper[4666]: I1203 12:15:32.423388 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:32 crc kubenswrapper[4666]: I1203 12:15:32.423344 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:32 crc kubenswrapper[4666]: E1203 12:15:32.423490 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:32 crc kubenswrapper[4666]: E1203 12:15:32.423738 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:32 crc kubenswrapper[4666]: E1203 12:15:32.423820 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:33 crc kubenswrapper[4666]: I1203 12:15:33.422627 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:33 crc kubenswrapper[4666]: E1203 12:15:33.422791 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:34 crc kubenswrapper[4666]: I1203 12:15:34.423432 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:34 crc kubenswrapper[4666]: I1203 12:15:34.423482 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:34 crc kubenswrapper[4666]: I1203 12:15:34.423544 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:34 crc kubenswrapper[4666]: E1203 12:15:34.423616 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:34 crc kubenswrapper[4666]: E1203 12:15:34.423740 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:34 crc kubenswrapper[4666]: E1203 12:15:34.423908 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:35 crc kubenswrapper[4666]: I1203 12:15:35.422671 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:35 crc kubenswrapper[4666]: E1203 12:15:35.422876 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:36 crc kubenswrapper[4666]: I1203 12:15:36.422804 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:36 crc kubenswrapper[4666]: I1203 12:15:36.422804 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:36 crc kubenswrapper[4666]: E1203 12:15:36.422965 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:36 crc kubenswrapper[4666]: I1203 12:15:36.422822 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:36 crc kubenswrapper[4666]: E1203 12:15:36.423177 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:36 crc kubenswrapper[4666]: E1203 12:15:36.423175 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:36 crc kubenswrapper[4666]: E1203 12:15:36.568164 4666 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:15:37 crc kubenswrapper[4666]: I1203 12:15:37.423035 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:37 crc kubenswrapper[4666]: E1203 12:15:37.423265 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:38 crc kubenswrapper[4666]: I1203 12:15:38.423339 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:38 crc kubenswrapper[4666]: I1203 12:15:38.423717 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:38 crc kubenswrapper[4666]: E1203 12:15:38.423864 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:38 crc kubenswrapper[4666]: I1203 12:15:38.423931 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:38 crc kubenswrapper[4666]: E1203 12:15:38.424396 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:38 crc kubenswrapper[4666]: I1203 12:15:38.424559 4666 scope.go:117] "RemoveContainer" containerID="8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd" Dec 03 12:15:38 crc kubenswrapper[4666]: E1203 12:15:38.424760 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:39 crc kubenswrapper[4666]: I1203 12:15:39.423859 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:39 crc kubenswrapper[4666]: E1203 12:15:39.424724 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:40 crc kubenswrapper[4666]: I1203 12:15:40.187656 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/1.log" Dec 03 12:15:40 crc kubenswrapper[4666]: I1203 12:15:40.187789 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerStarted","Data":"fcfa2f98a9da4e4ba0160ca2261d53489e3922f93e699e769d5c745afc146156"} Dec 03 12:15:40 crc kubenswrapper[4666]: I1203 12:15:40.423460 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:40 crc kubenswrapper[4666]: I1203 12:15:40.423502 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:40 crc kubenswrapper[4666]: I1203 12:15:40.423531 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:40 crc kubenswrapper[4666]: E1203 12:15:40.423642 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:40 crc kubenswrapper[4666]: E1203 12:15:40.423895 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:40 crc kubenswrapper[4666]: E1203 12:15:40.423976 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:41 crc kubenswrapper[4666]: I1203 12:15:41.423309 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:41 crc kubenswrapper[4666]: E1203 12:15:41.425630 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:41 crc kubenswrapper[4666]: E1203 12:15:41.568843 4666 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:15:42 crc kubenswrapper[4666]: I1203 12:15:42.422901 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:42 crc kubenswrapper[4666]: I1203 12:15:42.422985 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:42 crc kubenswrapper[4666]: I1203 12:15:42.422998 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:42 crc kubenswrapper[4666]: E1203 12:15:42.423646 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:42 crc kubenswrapper[4666]: E1203 12:15:42.423696 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:42 crc kubenswrapper[4666]: E1203 12:15:42.423458 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:43 crc kubenswrapper[4666]: I1203 12:15:43.422912 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:43 crc kubenswrapper[4666]: E1203 12:15:43.424075 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:43 crc kubenswrapper[4666]: I1203 12:15:43.425511 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.205643 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/3.log" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.208451 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerStarted","Data":"0bb44670867ae612d8f85b3373547805b8e13748b98e3083804cee3d938801c7"} Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.208862 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.250501 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podStartSLOduration=113.250481084 podStartE2EDuration="1m53.250481084s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:44.250151565 +0000 UTC m=+133.095112616" watchObservedRunningTime="2025-12-03 12:15:44.250481084 +0000 UTC m=+133.095442135" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.343834 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s4f78"] Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.343959 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:44 crc kubenswrapper[4666]: E1203 12:15:44.344063 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.422555 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:44 crc kubenswrapper[4666]: E1203 12:15:44.423340 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:44 crc kubenswrapper[4666]: I1203 12:15:44.422598 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:44 crc kubenswrapper[4666]: E1203 12:15:44.423545 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:45 crc kubenswrapper[4666]: I1203 12:15:45.423292 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:45 crc kubenswrapper[4666]: E1203 12:15:45.423511 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:46 crc kubenswrapper[4666]: I1203 12:15:46.423350 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:46 crc kubenswrapper[4666]: I1203 12:15:46.423517 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:46 crc kubenswrapper[4666]: I1203 12:15:46.423642 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:46 crc kubenswrapper[4666]: E1203 12:15:46.423632 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:46 crc kubenswrapper[4666]: E1203 12:15:46.423811 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:46 crc kubenswrapper[4666]: E1203 12:15:46.423876 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:46 crc kubenswrapper[4666]: E1203 12:15:46.570613 4666 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 12:15:47 crc kubenswrapper[4666]: I1203 12:15:47.423743 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:47 crc kubenswrapper[4666]: E1203 12:15:47.423973 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:48 crc kubenswrapper[4666]: I1203 12:15:48.423366 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:48 crc kubenswrapper[4666]: E1203 12:15:48.423571 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:48 crc kubenswrapper[4666]: I1203 12:15:48.423852 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:48 crc kubenswrapper[4666]: E1203 12:15:48.423942 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:48 crc kubenswrapper[4666]: I1203 12:15:48.424208 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:48 crc kubenswrapper[4666]: E1203 12:15:48.424354 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:49 crc kubenswrapper[4666]: I1203 12:15:49.423215 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:49 crc kubenswrapper[4666]: E1203 12:15:49.423438 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:50 crc kubenswrapper[4666]: I1203 12:15:50.422871 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:50 crc kubenswrapper[4666]: I1203 12:15:50.422918 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:50 crc kubenswrapper[4666]: E1203 12:15:50.423118 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 12:15:50 crc kubenswrapper[4666]: I1203 12:15:50.423208 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:50 crc kubenswrapper[4666]: E1203 12:15:50.423419 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s4f78" podUID="1889fa0a-c57e-4b03-884b-f096236b084b" Dec 03 12:15:50 crc kubenswrapper[4666]: E1203 12:15:50.423553 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 12:15:51 crc kubenswrapper[4666]: I1203 12:15:51.423058 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:51 crc kubenswrapper[4666]: E1203 12:15:51.424017 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.423006 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.423112 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.423133 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.427066 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.427291 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.428296 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.428328 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.428343 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 12:15:52 crc kubenswrapper[4666]: I1203 12:15:52.428826 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 12:15:53 crc kubenswrapper[4666]: I1203 12:15:53.422721 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.332449 4666 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.410148 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rkcvk"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.410887 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.411263 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.412310 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.413170 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.413339 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.413740 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.414131 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.414192 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.414996 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.415331 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.415582 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.416219 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.416831 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.417170 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.417788 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cm86d"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.418570 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.419493 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.420223 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.420687 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lts42"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.421262 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.421639 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.422003 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.424590 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.424861 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.424968 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425142 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425195 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425348 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425416 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425554 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425606 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425804 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426014 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.425564 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426156 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426446 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426561 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426616 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426791 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.426980 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 
03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.427022 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.427782 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.431671 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.431929 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.432450 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.432544 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.433029 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.431769 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.434505 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.436390 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.449740 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.450512 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.450644 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.450525 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wj872"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.450813 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.451532 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.451968 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.452376 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.452447 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.452505 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.452376 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.452727 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453048 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453105 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453138 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453231 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453399 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453526 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453525 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453556 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453583 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453613 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4r9r"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.453929 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.454755 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dn4sx"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455040 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455049 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455202 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455429 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455473 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455543 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455627 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.455874 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.456593 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.456759 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.456991 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.456992 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457188 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457331 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457436 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457471 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457528 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 
12:15:57.457628 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457640 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457640 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.457930 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.458645 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.458945 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.459057 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.459411 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.460155 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.461877 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.462567 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.462759 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.463689 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.463883 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.464604 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.464928 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.466375 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.467318 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.468057 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4r9r"] Dec 03 12:15:57 crc 
kubenswrapper[4666]: I1203 12:15:57.469575 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.469622 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr"] Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.471322 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.473465 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.474048 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.474281 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.474505 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.483895 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.485078 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.485207 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.485484 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.486302 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.488042 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.491760 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 12:15:57 crc kubenswrapper[4666]: I1203 12:15:57.493121 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.687487 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.687729 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.688825 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.692831 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" 
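The burst of reflector.go:368 "Caches populated" entries above is client-go at work: for each ConfigMap and Secret referenced by the pods assigned to this node, the kubelet starts a reflector that lists and watches that single object and mirrors it into a local informer cache, logging this line once the cache is synced. A minimal sketch of the same list/watch-into-cache pattern, assuming a reachable kubeconfig (the path and the single namespace below are illustrative choices, not taken from this log):

package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path; any reachable cluster works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// One namespaced ConfigMap informer, analogous to the per-object
	// caches the kubelet warms before mounting ConfigMap volumes.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openshift-apiserver"))
	cmInformer := factory.Core().V1().ConfigMaps().Informer()
	cmInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			cm := obj.(*corev1.ConfigMap)
			fmt.Printf("cached ConfigMap %s/%s\n", cm.Namespace, cm.Name)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	// The "Caches populated" log line corresponds to this sync point.
	cache.WaitForCacheSync(stop, cmInformer.HasSynced)
}

Once WaitForCacheSync returns, reads are served from the local store rather than the API server, which is why the kubelet can proceed to volume setup for these objects immediately after the lines above.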
Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.694824 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703307 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703357 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703385 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703417 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703448 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703471 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703497 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-encryption-config\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703520 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config\") pod 
\"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703551 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmrpz\" (UniqueName: \"kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703575 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4r66\" (UniqueName: \"kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703597 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gp8j\" (UniqueName: \"kubernetes.io/projected/6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca-kube-api-access-5gp8j\") pod \"downloads-7954f5f757-lts42\" (UID: \"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca\") " pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703658 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703683 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703705 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa39afc3-04ae-42c9-b042-15a136a64fb4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703727 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: 
\"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703787 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703814 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx7zf\" (UniqueName: \"kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703839 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703861 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-audit-dir\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703883 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703910 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703932 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-audit\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.703973 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: 
\"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704010 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704032 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-config\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704054 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8nf\" (UniqueName: \"kubernetes.io/projected/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-kube-api-access-kr8nf\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704078 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxmk\" (UniqueName: \"kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704122 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-machine-approver-tls\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704145 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704169 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6bf85a8-fe97-4a33-9441-04cb4c949118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704191 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb958399-7029-46f6-a41c-0cf7823af900-audit-dir\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704214 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwsxn\" (UniqueName: \"kubernetes.io/projected/95b963c3-6c15-49a4-9e37-2e16d825e46f-kube-api-access-cwsxn\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704261 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-serving-cert\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704299 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704328 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zn9\" (UniqueName: \"kubernetes.io/projected/86ecf1c7-b9b3-49b1-9b6a-421e96100984-kube-api-access-z5zn9\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704355 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d685477-11f2-4bfb-98c2-6eb76b6697c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704379 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-etcd-client\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704423 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95b963c3-6c15-49a4-9e37-2e16d825e46f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704456 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kxk\" (UniqueName: \"kubernetes.io/projected/7d685477-11f2-4bfb-98c2-6eb76b6697c3-kube-api-access-d5kxk\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: 
\"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704491 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-node-pullsecrets\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704516 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-etcd-serving-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704517 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704629 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705023 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.704540 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-auth-proxy-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705255 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-serving-cert\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705308 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52v87\" (UniqueName: \"kubernetes.io/projected/bb958399-7029-46f6-a41c-0cf7823af900-kube-api-access-52v87\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705345 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-config\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705408 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: 
\"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705446 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705473 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa39afc3-04ae-42c9-b042-15a136a64fb4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705502 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705530 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705612 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-audit-policies\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705714 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705944 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qn6g\" (UniqueName: \"kubernetes.io/projected/f45da577-5ce5-4221-a450-d12a58efb053-kube-api-access-9qn6g\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.705983 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj492\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-kube-api-access-wj492\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706031 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-image-import-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706049 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706073 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-config\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706110 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e6bf85a8-fe97-4a33-9441-04cb4c949118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706129 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706147 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95b963c3-6c15-49a4-9e37-2e16d825e46f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706255 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706316 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706448 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706558 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f45da577-5ce5-4221-a450-d12a58efb053-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706662 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-trusted-ca\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706756 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-metrics-tls\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706900 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706973 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-encryption-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.706992 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gnkx\" (UniqueName: \"kubernetes.io/projected/b9731be9-1a36-490c-a448-222468842a67-kube-api-access-2gnkx\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707016 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707033 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-etcd-client\") pod 
\"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707062 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjh9x\" (UniqueName: \"kubernetes.io/projected/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-kube-api-access-vjh9x\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707081 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwsb\" (UniqueName: \"kubernetes.io/projected/aa39afc3-04ae-42c9-b042-15a136a64fb4-kube-api-access-lzwsb\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707169 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707204 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06646f23-6df2-4308-9b72-fd7e108ad6e0-serving-cert\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707254 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707335 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-images\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707362 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8qb\" (UniqueName: \"kubernetes.io/projected/06646f23-6df2-4308-9b72-fd7e108ad6e0-kube-api-access-zz8qb\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707389 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ecf1c7-b9b3-49b1-9b6a-421e96100984-serving-cert\") pod 
\"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707478 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.707519 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.728655 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.730406 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.733555 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.735231 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsm4g"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.736102 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.736383 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.737188 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.741417 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.741523 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.742987 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.743851 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.744376 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.745156 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.749651 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.749925 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.750784 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fgm9v"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.751462 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.751708 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.753805 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.754028 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.754389 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.754583 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.761309 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.761440 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.762392 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.762522 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.763176 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.770538 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.770918 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.771226 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.771381 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.771742 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.772152 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.773150 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.773349 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.773676 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.773362 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.776283 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k4gbz"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.777222 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-k4gbz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779003 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779244 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779374 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779499 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779627 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779754 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779868 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.779990 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.780121 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.780265 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.792170 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.792441 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.792622 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.792746 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.792870 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.798677 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.799098 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.800037 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.801188 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 12:15:58 crc 
kubenswrapper[4666]: I1203 12:15:58.801664 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.802024 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.803058 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.804902 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.806003 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.806862 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.807032 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.806937 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.808010 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809044 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809117 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa39afc3-04ae-42c9-b042-15a136a64fb4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809150 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809187 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc 
kubenswrapper[4666]: I1203 12:15:58.809221 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4cdba64-e137-43f9-a7a9-ced14dde212e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809255 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809280 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx7zf\" (UniqueName: \"kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809307 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4cdba64-e137-43f9-a7a9-ced14dde212e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809334 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809358 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-audit-dir\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809385 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809413 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 
12:15:58.809439 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-audit\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809466 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809511 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809537 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-config\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809567 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8nf\" (UniqueName: \"kubernetes.io/projected/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-kube-api-access-kr8nf\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809597 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-client\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809626 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809660 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6bf85a8-fe97-4a33-9441-04cb4c949118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809683 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxmk\" (UniqueName: \"kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk\") pod \"controller-manager-879f6c89f-lv5pm\" 
(UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809712 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-machine-approver-tls\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809749 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb958399-7029-46f6-a41c-0cf7823af900-audit-dir\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809776 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwsxn\" (UniqueName: \"kubernetes.io/projected/95b963c3-6c15-49a4-9e37-2e16d825e46f-kube-api-access-cwsxn\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809801 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-serving-cert\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809827 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-serving-cert\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809853 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809881 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zn9\" (UniqueName: \"kubernetes.io/projected/86ecf1c7-b9b3-49b1-9b6a-421e96100984-kube-api-access-z5zn9\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809910 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4cdba64-e137-43f9-a7a9-ced14dde212e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" 
Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809933 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-service-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809967 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d685477-11f2-4bfb-98c2-6eb76b6697c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.809992 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-etcd-client\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810016 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95b963c3-6c15-49a4-9e37-2e16d825e46f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810036 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kxk\" (UniqueName: \"kubernetes.io/projected/7d685477-11f2-4bfb-98c2-6eb76b6697c3-kube-api-access-d5kxk\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810062 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-node-pullsecrets\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810119 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-etcd-serving-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810150 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-auth-proxy-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810174 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-serving-cert\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810200 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-config\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810225 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810262 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52v87\" (UniqueName: \"kubernetes.io/projected/bb958399-7029-46f6-a41c-0cf7823af900-kube-api-access-52v87\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810288 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810309 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa39afc3-04ae-42c9-b042-15a136a64fb4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810334 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810361 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-audit-policies\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810385 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810423 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qn6g\" (UniqueName: \"kubernetes.io/projected/f45da577-5ce5-4221-a450-d12a58efb053-kube-api-access-9qn6g\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810450 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj492\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-kube-api-access-wj492\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810478 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-config\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810502 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810526 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-audit\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810535 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-image-import-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810562 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810587 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-config\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810609 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/e6bf85a8-fe97-4a33-9441-04cb4c949118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810635 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810661 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95b963c3-6c15-49a4-9e37-2e16d825e46f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810690 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810718 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810758 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-encryption-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810783 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gnkx\" (UniqueName: \"kubernetes.io/projected/b9731be9-1a36-490c-a448-222468842a67-kube-api-access-2gnkx\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810828 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f45da577-5ce5-4221-a450-d12a58efb053-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810853 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-trusted-ca\") pod \"console-operator-58897d9998-wj872\" (UID: 
\"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810879 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-metrics-tls\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810903 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.810990 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811026 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-etcd-client\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811054 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjh9x\" (UniqueName: \"kubernetes.io/projected/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-kube-api-access-vjh9x\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811101 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwsb\" (UniqueName: \"kubernetes.io/projected/aa39afc3-04ae-42c9-b042-15a136a64fb4-kube-api-access-lzwsb\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811137 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811169 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06646f23-6df2-4308-9b72-fd7e108ad6e0-serving-cert\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811202 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbzcp\" (UniqueName: \"kubernetes.io/projected/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-kube-api-access-mbzcp\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811232 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811262 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-images\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811287 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8qb\" (UniqueName: \"kubernetes.io/projected/06646f23-6df2-4308-9b72-fd7e108ad6e0-kube-api-access-zz8qb\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811312 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ecf1c7-b9b3-49b1-9b6a-421e96100984-serving-cert\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811411 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811439 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811469 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811499 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811541 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811566 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811594 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-encryption-config\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811635 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811664 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811694 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811725 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmrpz\" (UniqueName: \"kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811777 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " 
pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811813 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4r66\" (UniqueName: \"kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811849 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gp8j\" (UniqueName: \"kubernetes.io/projected/6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca-kube-api-access-5gp8j\") pod \"downloads-7954f5f757-lts42\" (UID: \"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca\") " pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.811879 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.812537 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.814896 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.838676 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-image-import-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.839881 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.839924 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.841075 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc 
kubenswrapper[4666]: I1203 12:15:58.841552 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.841656 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa39afc3-04ae-42c9-b042-15a136a64fb4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.842111 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.842270 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6bf85a8-fe97-4a33-9441-04cb4c949118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.842538 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.842636 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.842996 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.843179 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.845347 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.845706 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-config\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 
12:15:58.848339 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.848727 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.849019 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-config\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.850285 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.851226 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-machine-approver-tls\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.853452 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.853668 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa39afc3-04ae-42c9-b042-15a136a64fb4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.859691 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-service-ca-bundle\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.860808 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: 
\"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.862065 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06646f23-6df2-4308-9b72-fd7e108ad6e0-trusted-ca\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.862514 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecf1c7-b9b3-49b1-9b6a-421e96100984-config\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.863251 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-metrics-tls\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.863697 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qgbrh"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.864447 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.864780 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.864897 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.865195 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-etcd-serving-ca\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.865263 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-node-pullsecrets\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.867620 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-encryption-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.867629 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.869494 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.869826 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95b963c3-6c15-49a4-9e37-2e16d825e46f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.870654 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.871274 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-auth-proxy-config\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.872346 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.873069 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.875575 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.875634 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-audit-policies\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.876369 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.876897 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9731be9-1a36-490c-a448-222468842a67-audit-dir\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.877008 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.877685 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d685477-11f2-4bfb-98c2-6eb76b6697c3-images\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.877857 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.877915 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb958399-7029-46f6-a41c-0cf7823af900-audit-dir\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc 
kubenswrapper[4666]: I1203 12:15:58.878017 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.878255 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.882514 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.882709 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.883402 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb958399-7029-46f6-a41c-0cf7823af900-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.883562 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.884261 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dn4sx"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.884366 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.886420 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.886615 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.887122 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e6bf85a8-fe97-4a33-9441-04cb4c949118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.887344 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4r66\" (UniqueName: \"kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.888706 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95b963c3-6c15-49a4-9e37-2e16d825e46f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.889069 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.889572 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9731be9-1a36-490c-a448-222468842a67-config\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.890880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.896746 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.898161 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg"] 
Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.902215 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d685477-11f2-4bfb-98c2-6eb76b6697c3-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.902560 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ecf1c7-b9b3-49b1-9b6a-421e96100984-serving-cert\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.902608 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.905225 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj492\" (UniqueName: \"kubernetes.io/projected/e6bf85a8-fe97-4a33-9441-04cb4c949118-kube-api-access-wj492\") pod \"cluster-image-registry-operator-dc59b4c8b-56n9f\" (UID: \"e6bf85a8-fe97-4a33-9441-04cb4c949118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.906353 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.906758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-serving-cert\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.907272 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwsxn\" (UniqueName: \"kubernetes.io/projected/95b963c3-6c15-49a4-9e37-2e16d825e46f-kube-api-access-cwsxn\") pod \"openshift-config-operator-7777fb866f-wjhvq\" (UID: \"95b963c3-6c15-49a4-9e37-2e16d825e46f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.908983 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.909022 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsm4g"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.909034 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.909046 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.909444 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.909787 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.910729 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.910943 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjh9x\" (UniqueName: \"kubernetes.io/projected/f7d1dfeb-f488-4924-8d1c-dc8a32a124aa-kube-api-access-vjh9x\") pod \"dns-operator-744455d44c-cm86d\" (UID: \"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.911032 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.913882 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwsb\" (UniqueName: \"kubernetes.io/projected/aa39afc3-04ae-42c9-b042-15a136a64fb4-kube-api-access-lzwsb\") pod \"openshift-apiserver-operator-796bbdcf4f-tlpmn\" (UID: \"aa39afc3-04ae-42c9-b042-15a136a64fb4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.911166 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.911483 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.911621 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.912568 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.913979 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88qxb"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.913256 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gp8j\" (UniqueName: \"kubernetes.io/projected/6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca-kube-api-access-5gp8j\") pod \"downloads-7954f5f757-lts42\" (UID: \"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca\") " pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914028 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mc7\" (UniqueName: \"kubernetes.io/projected/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-kube-api-access-t9mc7\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914062 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a60617f-4a55-4818-a37a-2d745a296b97-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914152 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.913411 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914246 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914289 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/551310dc-b537-4268-a72a-899169944815-metrics-tls\") pod 
\"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.913011 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert\") pod \"console-f9d7485db-rxcq5\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914347 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914401 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914427 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttrx\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-kube-api-access-6ttrx\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914493 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4cdba64-e137-43f9-a7a9-ced14dde212e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914525 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4cdba64-e137-43f9-a7a9-ced14dde212e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914551 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64s6l\" (UniqueName: \"kubernetes.io/projected/551310dc-b537-4268-a72a-899169944815-kube-api-access-64s6l\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914577 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914626 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866sg\" (UniqueName: \"kubernetes.io/projected/eadd4d63-6584-4b28-a233-8274a7941462-kube-api-access-866sg\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914675 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914706 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-client\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914727 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914771 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4cdba64-e137-43f9-a7a9-ced14dde212e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914793 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-service-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914816 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eadd4d63-6584-4b28-a233-8274a7941462-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914863 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-config\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914909 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpgf\" (UniqueName: \"kubernetes.io/projected/779f8827-42c7-4e5a-a89b-ba23d7e11e14-kube-api-access-9jpgf\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" Dec 03 12:15:58 crc kubenswrapper[4666]: E1203 12:15:58.914924 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.414906693 +0000 UTC m=+148.259867744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.914977 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915012 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbzcp\" (UniqueName: \"kubernetes.io/projected/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-kube-api-access-mbzcp\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915032 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-service-ca-bundle\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915066 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915133 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915084 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-stats-auth\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915682 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60617f-4a55-4818-a37a-2d745a296b97-config\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915708 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/779f8827-42c7-4e5a-a89b-ba23d7e11e14-proxy-tls\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915740 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a60617f-4a55-4818-a37a-2d745a296b97-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915874 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eadd4d63-6584-4b28-a233-8274a7941462-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915922 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmtv\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.916996 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.917191 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/551310dc-b537-4268-a72a-899169944815-config-volume\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " 
pod="openshift-dns/dns-default-k4gbz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.917353 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-default-certificate\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.917397 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/779f8827-42c7-4e5a-a89b-ba23d7e11e14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.919253 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-serving-cert\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.916898 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-service-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.916151 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-etcd-client\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.912122 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx7zf\" (UniqueName: \"kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.917478 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5czcz\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.915030 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kbcqm"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.917396 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4cdba64-e137-43f9-a7a9-ced14dde212e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918102 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxmk\" (UniqueName: \"kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk\") pod \"controller-manager-879f6c89f-lv5pm\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918156 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-ca\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918365 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-config\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.919805 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-metrics-certs\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918742 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-encryption-config\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918760 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb958399-7029-46f6-a41c-0cf7823af900-serving-cert\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.918398 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06646f23-6df2-4308-9b72-fd7e108ad6e0-serving-cert\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.916778 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f45da577-5ce5-4221-a450-d12a58efb053-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.920694 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"] Dec 03 12:15:58 crc kubenswrapper[4666]: 
I1203 12:15:58.920926 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.921652 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.921806 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.922679 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zn9\" (UniqueName: \"kubernetes.io/projected/86ecf1c7-b9b3-49b1-9b6a-421e96100984-kube-api-access-z5zn9\") pod \"authentication-operator-69f744f599-f4r9r\" (UID: \"86ecf1c7-b9b3-49b1-9b6a-421e96100984\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.923158 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-etcd-client\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.924598 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-serving-cert\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.925145 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d42mb"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.925311 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.926023 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9731be9-1a36-490c-a448-222468842a67-etcd-client\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.926362 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.926573 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.926783 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.927079 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.927581 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wj872"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.927606 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nn69f"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.927703 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.928232 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.928348 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.929153 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.929224 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.931918 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.932000 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.935451 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.938905 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kxk\" (UniqueName: \"kubernetes.io/projected/7d685477-11f2-4bfb-98c2-6eb76b6697c3-kube-api-access-d5kxk\") pod \"machine-api-operator-5694c8668f-dn4sx\" (UID: \"7d685477-11f2-4bfb-98c2-6eb76b6697c3\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.940366 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8qb\" (UniqueName: \"kubernetes.io/projected/06646f23-6df2-4308-9b72-fd7e108ad6e0-kube-api-access-zz8qb\") pod \"console-operator-58897d9998-wj872\" (UID: \"06646f23-6df2-4308-9b72-fd7e108ad6e0\") " pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.940531 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4cdba64-e137-43f9-a7a9-ced14dde212e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.945282 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmrpz\" (UniqueName: 
\"kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz\") pod \"route-controller-manager-6576b87f9c-vdcsd\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.949747 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lts42"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.952885 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.953293 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.954236 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.960033 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qn6g\" (UniqueName: \"kubernetes.io/projected/f45da577-5ce5-4221-a450-d12a58efb053-kube-api-access-9qn6g\") pod \"cluster-samples-operator-665b6dd947-8zltk\" (UID: \"f45da577-5ce5-4221-a450-d12a58efb053\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.960959 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.973151 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cm86d"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.974149 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.974309 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.985494 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.985550 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d42mb"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.988181 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.988550 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.990674 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k4gbz"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.993936 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qgbrh"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.993987 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.994748 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.995838 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.998849 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk"] Dec 03 12:15:58 crc kubenswrapper[4666]: I1203 12:15:58.998886 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"] Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.001765 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx"] Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.003327 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rkcvk"] Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.005274 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nn69f"] Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.006344 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxcq5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.009999 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.011317 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.012360 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.013534 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.015209 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52v87\" (UniqueName: \"kubernetes.io/projected/bb958399-7029-46f6-a41c-0cf7823af900-kube-api-access-52v87\") pod \"apiserver-7bbb656c7d-bgzdr\" (UID: \"bb958399-7029-46f6-a41c-0cf7823af900\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.019264 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.019317 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88qxb"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020339 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020642 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/551310dc-b537-4268-a72a-899169944815-config-volume\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020701 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-default-certificate\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020730 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/779f8827-42c7-4e5a-a89b-ba23d7e11e14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020770 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-metrics-certs\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020800 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020819 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mc7\" (UniqueName: \"kubernetes.io/projected/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-kube-api-access-t9mc7\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020836 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a60617f-4a55-4818-a37a-2d745a296b97-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020855 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020955 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020971 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/551310dc-b537-4268-a72a-899169944815-metrics-tls\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.020998 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021023 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttrx\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-kube-api-access-6ttrx\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021055 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64s6l\" (UniqueName: \"kubernetes.io/projected/551310dc-b537-4268-a72a-899169944815-kube-api-access-64s6l\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021073 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021117 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-866sg\" (UniqueName: \"kubernetes.io/projected/eadd4d63-6584-4b28-a233-8274a7941462-kube-api-access-866sg\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021162 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021209 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021242 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eadd4d63-6584-4b28-a233-8274a7941462-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021298 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpgf\" (UniqueName: \"kubernetes.io/projected/779f8827-42c7-4e5a-a89b-ba23d7e11e14-kube-api-access-9jpgf\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021326 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021372 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-service-ca-bundle\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.021419 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.021918 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.521847085 +0000 UTC m=+148.366808136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.022881 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/551310dc-b537-4268-a72a-899169944815-config-volume\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023244 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-stats-auth\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023319 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a60617f-4a55-4818-a37a-2d745a296b97-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023341 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60617f-4a55-4818-a37a-2d745a296b97-config\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023357 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/779f8827-42c7-4e5a-a89b-ba23d7e11e14-proxy-tls\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023401 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eadd4d63-6584-4b28-a233-8274a7941462-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.023434 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmtv\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.024210 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.024272 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.025706 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.026305 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/779f8827-42c7-4e5a-a89b-ba23d7e11e14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.029563 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.034715 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.035748 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.037838 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.038364 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60617f-4a55-4818-a37a-2d745a296b97-config\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.040864 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-service-ca-bundle\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.041780 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.043750 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eadd4d63-6584-4b28-a233-8274a7941462-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.045683 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.046626 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a60617f-4a55-4818-a37a-2d745a296b97-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.047148 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eadd4d63-6584-4b28-a233-8274a7941462-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.047501 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.048393 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-stats-auth\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.048847 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/779f8827-42c7-4e5a-a89b-ba23d7e11e14-proxy-tls\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.064138 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.067340 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.068366 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gnkx\" (UniqueName: \"kubernetes.io/projected/b9731be9-1a36-490c-a448-222468842a67-kube-api-access-2gnkx\") pod \"apiserver-76f77b778f-rkcvk\" (UID: \"b9731be9-1a36-490c-a448-222468842a67\") " pod="openshift-apiserver/apiserver-76f77b778f-rkcvk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.072058 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.073367 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-default-certificate\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.073668 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.073960 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-metrics-certs\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.077857 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/551310dc-b537-4268-a72a-899169944815-metrics-tls\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.081701 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.094370 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.101065 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lts42"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.112318 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.124805 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.125284 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.625268559 +0000 UTC m=+148.470229610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.131481 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8nf\" (UniqueName: \"kubernetes.io/projected/9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5-kube-api-access-kr8nf\") pod \"machine-approver-56656f9798-8gr95\" (UID: \"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.153920 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.175877 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.206798 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.213304 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wj872"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.225509 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.225959 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.725936527 +0000 UTC m=+148.570897578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.228118 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.246873 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.247050 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.255880 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.259959 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.275385 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.295891 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.313881 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.329042 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.329539 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.829523695 +0000 UTC m=+148.674484746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.332965 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.354054 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.373733 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.374727 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.393838 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.413461 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.430779 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.431301 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.432126 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.931665413 +0000 UTC m=+148.776626464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.432177 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.432216 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.432320 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.432372 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.433540 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.433914 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:15:59.933905365 +0000 UTC m=+148.778866416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.435701 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.435713 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.436474 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.437071 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.438194 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.455174 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.496390 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbzcp\" (UniqueName: \"kubernetes.io/projected/5c58a7e7-a332-4dd3-b7ed-cbaea0826134-kube-api-access-mbzcp\") pod \"etcd-operator-b45778765-fsm4g\" (UID: \"5c58a7e7-a332-4dd3-b7ed-cbaea0826134\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.496077 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.535263 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.535649 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.035626632 +0000 UTC m=+148.880587683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.542180 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.555127 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.599915 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.602513 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.602843 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.614142 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4cdba64-e137-43f9-a7a9-ced14dde212e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rn69k\" (UID: \"f4cdba64-e137-43f9-a7a9-ced14dde212e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.615460 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.638846 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.639442 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.640260 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.140191778 +0000 UTC m=+148.985153009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.649461 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.652356 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.663466 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.672772 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.690757 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.694048 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.713741 4666 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.714006 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" event={"ID":"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5","Type":"ContainerStarted","Data":"2daa7dce241d812621be1fdf247d7e4ad2324ae404319f4ea9d2914cb887d723"}
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.735944 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.741780 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.742280 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.242258184 +0000 UTC m=+149.087219235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.754395 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.785906 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.794200 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.837793 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.841708 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.843573 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.844008 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.343994811 +0000 UTC m=+149.188955862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.868300 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.890052 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.906775 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.929035 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.947874 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:15:59 crc kubenswrapper[4666]: E1203 12:15:59.948370 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.448349531 +0000 UTC m=+149.293310582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.948416 4666 request.go:700] Waited for 1.016258192s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.950615 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.953643 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dn4sx"]
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.960527 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.979609 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 12:15:59 crc kubenswrapper[4666]: I1203 12:15:59.983251 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"]
Dec 03 12:15:59 crc kubenswrapper[4666]: W1203 12:15:59.993466 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d685477_11f2_4bfb_98c2_6eb76b6697c3.slice/crio-57f4e88be3c98210b5247786ea09c449b7a4175a81b12f1b98f5644880201ba9 WatchSource:0}: Error finding container 57f4e88be3c98210b5247786ea09c449b7a4175a81b12f1b98f5644880201ba9: Status 404 returned error can't find the container with id 57f4e88be3c98210b5247786ea09c449b7a4175a81b12f1b98f5644880201ba9
Dec 03 12:15:59 crc kubenswrapper[4666]: W1203 12:15:59.994685 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ecf1c7_b9b3_49b1_9b6a_421e96100984.slice/crio-14bb85ccc3469a495146ff6ed21d87308e3a23885fae51cf597fff6211b4effb WatchSource:0}: Error finding container 14bb85ccc3469a495146ff6ed21d87308e3a23885fae51cf597fff6211b4effb: Status 404 returned error can't find the container with id 14bb85ccc3469a495146ff6ed21d87308e3a23885fae51cf597fff6211b4effb
Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:15:59.998346 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b963c3_6c15_49a4_9e37_2e16d825e46f.slice/crio-1979ac674128b59c18266ba79ae9712d432c2039d23a8edce78ad1acdfc2da7d WatchSource:0}: Error finding container 1979ac674128b59c18266ba79ae9712d432c2039d23a8edce78ad1acdfc2da7d: Status 404 returned error can't find the container with id 1979ac674128b59c18266ba79ae9712d432c2039d23a8edce78ad1acdfc2da7d
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.022625 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f4r9r"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.026680 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.028513 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.031115 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.031816 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lts42"]
Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.038173 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7583101e_f814_41d1_9b78_086c48e16385.slice/crio-f893e7755991ae31c02a8e139f24f5295ef678fbc02109da5eca9af8791eaa1d WatchSource:0}: Error finding container f893e7755991ae31c02a8e139f24f5295ef678fbc02109da5eca9af8791eaa1d: Status 404 returned error can't find the container with id f893e7755991ae31c02a8e139f24f5295ef678fbc02109da5eca9af8791eaa1d
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.040893 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmtv\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.050561 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.063564 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.063643 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cm86d"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.067858 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk"]
Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.072544 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.57252285 +0000 UTC m=+149.417483891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.072936 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.080161 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.087613 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"]
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.088566 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a60617f-4a55-4818-a37a-2d745a296b97-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8jjc5\" (UID: \"6a60617f-4a55-4818-a37a-2d745a296b97\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.096556 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mc7\" (UniqueName: \"kubernetes.io/projected/96e5f9ae-3c4b-4016-b1d5-c6a1a1326581-kube-api-access-t9mc7\") pod \"router-default-5444994796-fgm9v\" (UID: \"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581\") " pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.116687 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpgf\" (UniqueName: \"kubernetes.io/projected/779f8827-42c7-4e5a-a89b-ba23d7e11e14-kube-api-access-9jpgf\") pod \"machine-config-controller-84d6567774-vx2dk\" (UID: \"779f8827-42c7-4e5a-a89b-ba23d7e11e14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.135111 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttrx\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-kube-api-access-6ttrx\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.155261 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.155489 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.655464497 +0000 UTC m=+149.500425548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.156377 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.156772 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.656760063 +0000 UTC m=+149.501721114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.163527 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64s6l\" (UniqueName: \"kubernetes.io/projected/551310dc-b537-4268-a72a-899169944815-kube-api-access-64s6l\") pod \"dns-default-k4gbz\" (UID: \"551310dc-b537-4268-a72a-899169944815\") " pod="openshift-dns/dns-default-k4gbz"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.176190 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24aef896-5b2d-4c09-8230-ce7bd5b3a0f2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vz6p\" (UID: \"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.178903 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.188042 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866sg\" (UniqueName: \"kubernetes.io/projected/eadd4d63-6584-4b28-a233-8274a7941462-kube-api-access-866sg\") pod \"openshift-controller-manager-operator-756b6f6bc6-5wd5d\" (UID: \"eadd4d63-6584-4b28-a233-8274a7941462\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.201210 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.214159 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fgm9v"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.220212 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258276 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258815 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-key\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258887 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-node-bootstrap-token\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258913 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258936 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-csi-data-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258981 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmrk\" (UniqueName: \"kubernetes.io/projected/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-kube-api-access-9mmrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.258998 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259028 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3e7281-d500-4cd3-bec9-daec25718f94-serving-cert\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259047 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lh9\" (UniqueName: \"kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259062 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbf4j\" (UniqueName: \"kubernetes.io/projected/a23456af-dc74-48f2-914b-84e7f6c549f1-kube-api-access-zbf4j\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259078 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fac2dbb-959e-4c17-993c-4e9593f00f99-config\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259140 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259241 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259260 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-plugins-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259285 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-cabundle\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb"
Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259304 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-mountpoint-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259320 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fac2dbb-959e-4c17-993c-4e9593f00f99-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259340 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259358 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb2d553f-f328-499a-a93a-f7d62e54f118-cert\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259426 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-images\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259454 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7kf8\" (UniqueName: \"kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259483 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259501 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m55bt\" (UniqueName: \"kubernetes.io/projected/eec32d83-8cb7-4573-a0eb-045cb7df0458-kube-api-access-m55bt\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259541 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzxg9\" (UniqueName: \"kubernetes.io/projected/8b3e7281-d500-4cd3-bec9-daec25718f94-kube-api-access-xzxg9\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259560 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrzzn\" (UniqueName: \"kubernetes.io/projected/fb2d553f-f328-499a-a93a-f7d62e54f118-kube-api-access-hrzzn\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259595 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq7g\" (UniqueName: \"kubernetes.io/projected/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-kube-api-access-rsq7g\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259612 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259678 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3e7281-d500-4cd3-bec9-daec25718f94-config\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259695 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259729 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214640b1-fa87-4af1-a790-173866c1263c-proxy-tls\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259744 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259775 4666 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259793 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259809 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7glrj\" (UniqueName: \"kubernetes.io/projected/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-kube-api-access-7glrj\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259825 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trpgt\" (UniqueName: \"kubernetes.io/projected/7c3e34f2-4982-4638-8f87-92318d6105ea-kube-api-access-trpgt\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259839 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-srv-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259854 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww5mf\" (UniqueName: \"kubernetes.io/projected/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-kube-api-access-ww5mf\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259884 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwdh\" (UniqueName: \"kubernetes.io/projected/e06cf97d-355e-437b-8853-5088922efb9f-kube-api-access-mnwdh\") pod \"migrator-59844c95c7-swpdg\" (UID: \"e06cf97d-355e-437b-8853-5088922efb9f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259920 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c3e34f2-4982-4638-8f87-92318d6105ea-tmpfs\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc 
kubenswrapper[4666]: I1203 12:16:00.259945 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbblm\" (UniqueName: \"kubernetes.io/projected/fe37528e-77b6-4a4c-91e9-89674636636e-kube-api-access-dbblm\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259966 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-srv-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.259987 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-socket-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260020 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe37528e-77b6-4a4c-91e9-89674636636e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260047 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-registration-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260133 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-certs\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260182 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vklf\" (UniqueName: \"kubernetes.io/projected/39a33410-5d77-47cc-a5c7-d41a8431dba0-kube-api-access-8vklf\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260221 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260257 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-webhook-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260289 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvhz\" (UniqueName: \"kubernetes.io/projected/214640b1-fa87-4af1-a790-173866c1263c-kube-api-access-tfvhz\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260309 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fac2dbb-959e-4c17-993c-4e9593f00f99-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.260324 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxnw\" (UniqueName: \"kubernetes.io/projected/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-kube-api-access-tzxnw\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.260436 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.760416923 +0000 UTC m=+149.605377974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.269117 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wj872"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.279120 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-k4gbz" Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.290733 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06646f23_6df2_4308_9b72_fd7e108ad6e0.slice/crio-b1395eef2c6faeb979f644fae60afd1924ee15b7771ac0fabeea9373174c8c1f WatchSource:0}: Error finding container b1395eef2c6faeb979f644fae60afd1924ee15b7771ac0fabeea9373174c8c1f: Status 404 returned error can't find the container with id b1395eef2c6faeb979f644fae60afd1924ee15b7771ac0fabeea9373174c8c1f Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365272 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxg9\" (UniqueName: \"kubernetes.io/projected/8b3e7281-d500-4cd3-bec9-daec25718f94-kube-api-access-xzxg9\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365319 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrzzn\" (UniqueName: \"kubernetes.io/projected/fb2d553f-f328-499a-a93a-f7d62e54f118-kube-api-access-hrzzn\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365343 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq7g\" (UniqueName: \"kubernetes.io/projected/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-kube-api-access-rsq7g\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365377 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365411 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3e7281-d500-4cd3-bec9-daec25718f94-config\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365444 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214640b1-fa87-4af1-a790-173866c1263c-proxy-tls\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365466 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc 
kubenswrapper[4666]: I1203 12:16:00.365486 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365511 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365530 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365548 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7glrj\" (UniqueName: \"kubernetes.io/projected/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-kube-api-access-7glrj\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365567 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trpgt\" (UniqueName: \"kubernetes.io/projected/7c3e34f2-4982-4638-8f87-92318d6105ea-kube-api-access-trpgt\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365583 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-srv-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365596 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww5mf\" (UniqueName: \"kubernetes.io/projected/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-kube-api-access-ww5mf\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365624 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwdh\" (UniqueName: \"kubernetes.io/projected/e06cf97d-355e-437b-8853-5088922efb9f-kube-api-access-mnwdh\") pod \"migrator-59844c95c7-swpdg\" (UID: \"e06cf97d-355e-437b-8853-5088922efb9f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 
12:16:00.365668 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c3e34f2-4982-4638-8f87-92318d6105ea-tmpfs\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365696 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbblm\" (UniqueName: \"kubernetes.io/projected/fe37528e-77b6-4a4c-91e9-89674636636e-kube-api-access-dbblm\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365724 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-srv-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365748 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-socket-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365774 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe37528e-77b6-4a4c-91e9-89674636636e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365805 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-registration-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365831 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-certs\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365853 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vklf\" (UniqueName: \"kubernetes.io/projected/39a33410-5d77-47cc-a5c7-d41a8431dba0-kube-api-access-8vklf\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365878 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: 
\"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365901 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-webhook-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365925 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvhz\" (UniqueName: \"kubernetes.io/projected/214640b1-fa87-4af1-a790-173866c1263c-kube-api-access-tfvhz\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365952 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fac2dbb-959e-4c17-993c-4e9593f00f99-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365973 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxnw\" (UniqueName: \"kubernetes.io/projected/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-kube-api-access-tzxnw\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.365999 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-key\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366056 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-node-bootstrap-token\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366076 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366113 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-csi-data-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366139 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9mmrk\" (UniqueName: \"kubernetes.io/projected/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-kube-api-access-9mmrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366158 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366186 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3e7281-d500-4cd3-bec9-daec25718f94-serving-cert\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366211 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lh9\" (UniqueName: \"kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366233 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbf4j\" (UniqueName: \"kubernetes.io/projected/a23456af-dc74-48f2-914b-84e7f6c549f1-kube-api-access-zbf4j\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366289 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366313 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fac2dbb-959e-4c17-993c-4e9593f00f99-config\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366345 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366391 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366408 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-plugins-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366426 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-cabundle\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366445 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-mountpoint-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366463 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fac2dbb-959e-4c17-993c-4e9593f00f99-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366478 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366495 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb2d553f-f328-499a-a93a-f7d62e54f118-cert\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366529 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-images\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366550 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7kf8\" (UniqueName: \"kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8\") pod 
\"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366571 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366590 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m55bt\" (UniqueName: \"kubernetes.io/projected/eec32d83-8cb7-4573-a0eb-045cb7df0458-kube-api-access-m55bt\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366835 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.366908 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-csi-data-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.370463 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c3e34f2-4982-4638-8f87-92318d6105ea-tmpfs\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.370692 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-registration-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.370786 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-socket-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.371501 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.370449 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3e7281-d500-4cd3-bec9-daec25718f94-config\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.371939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.372796 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-srv-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.374355 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-certs\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.374477 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.374786 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe37528e-77b6-4a4c-91e9-89674636636e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.376141 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-cabundle\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.376773 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.876755515 +0000 UTC m=+149.721716566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.377366 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3e7281-d500-4cd3-bec9-daec25718f94-serving-cert\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.377466 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-mountpoint-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.377651 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fac2dbb-959e-4c17-993c-4e9593f00f99-config\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.378993 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-signing-key\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.379080 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/214640b1-fa87-4af1-a790-173866c1263c-images\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.380386 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.383636 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.383939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-plugins-dir\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc 
kubenswrapper[4666]: I1203 12:16:00.389704 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb2d553f-f328-499a-a93a-f7d62e54f118-cert\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.389929 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/39a33410-5d77-47cc-a5c7-d41a8431dba0-node-bootstrap-token\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.391803 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.391946 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.393624 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eec32d83-8cb7-4573-a0eb-045cb7df0458-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.393717 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rkcvk"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.394606 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fac2dbb-959e-4c17-993c-4e9593f00f99-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.394956 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-srv-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.395036 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214640b1-fa87-4af1-a790-173866c1263c-proxy-tls\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.395510 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a23456af-dc74-48f2-914b-84e7f6c549f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.395644 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.396221 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c3e34f2-4982-4638-8f87-92318d6105ea-webhook-cert\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.400152 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.400844 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsq7g\" (UniqueName: \"kubernetes.io/projected/992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6-kube-api-access-rsq7g\") pod \"csi-hostpathplugin-d42mb\" (UID: \"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6\") " pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.402679 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.412222 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.422496 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.425148 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsm4g"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.429831 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxg9\" (UniqueName: \"kubernetes.io/projected/8b3e7281-d500-4cd3-bec9-daec25718f94-kube-api-access-xzxg9\") pod \"service-ca-operator-777779d784-wqgrb\" (UID: \"8b3e7281-d500-4cd3-bec9-daec25718f94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.444962 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrzzn\" (UniqueName: \"kubernetes.io/projected/fb2d553f-f328-499a-a93a-f7d62e54f118-kube-api-access-hrzzn\") pod \"ingress-canary-nn69f\" (UID: \"fb2d553f-f328-499a-a93a-f7d62e54f118\") " pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.457004 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55bt\" (UniqueName: \"kubernetes.io/projected/eec32d83-8cb7-4573-a0eb-045cb7df0458-kube-api-access-m55bt\") pod \"olm-operator-6b444d44fb-7pw99\" (UID: \"eec32d83-8cb7-4573-a0eb-045cb7df0458\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.464999 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.468463 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.469385 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:00.969357019 +0000 UTC m=+149.814318070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.485951 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.485970 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmrk\" (UniqueName: \"kubernetes.io/projected/3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693-kube-api-access-9mmrk\") pod \"kube-storage-version-migrator-operator-b67b599dd-bjvpx\" (UID: \"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.497442 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7glrj\" (UniqueName: \"kubernetes.io/projected/61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae-kube-api-access-7glrj\") pod \"control-plane-machine-set-operator-78cbb6b69f-bsdf6\" (UID: \"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.517272 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4cdba64_e137_43f9_a7a9_ced14dde212e.slice/crio-8fa3411e56637e82b82a2b070a4332a8fc15e1835096e197fa29f5472e2b57fd WatchSource:0}: Error finding container 8fa3411e56637e82b82a2b070a4332a8fc15e1835096e197fa29f5472e2b57fd: Status 404 returned error can't find the container with id 8fa3411e56637e82b82a2b070a4332a8fc15e1835096e197fa29f5472e2b57fd Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.517972 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9731be9_1a36_490c_a448_222468842a67.slice/crio-8ae6c0aaee3cee9f76b7f1d41793de6f82f4e904c1354fc6900d703f61fa6cba WatchSource:0}: Error finding container 8ae6c0aaee3cee9f76b7f1d41793de6f82f4e904c1354fc6900d703f61fa6cba: Status 404 returned error can't find the container with id 8ae6c0aaee3cee9f76b7f1d41793de6f82f4e904c1354fc6900d703f61fa6cba Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.522097 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trpgt\" (UniqueName: \"kubernetes.io/projected/7c3e34f2-4982-4638-8f87-92318d6105ea-kube-api-access-trpgt\") pod \"packageserver-d55dfcdfc-k9dxb\" (UID: \"7c3e34f2-4982-4638-8f87-92318d6105ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.528894 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb958399_7029_46f6_a41c_0cf7823af900.slice/crio-b0f03522b50de3f77d00e9622b2d1ea7de33afa8586a0c3682d2616a814675d3 WatchSource:0}: Error finding container b0f03522b50de3f77d00e9622b2d1ea7de33afa8586a0c3682d2616a814675d3: Status 404 returned error can't find the container with id b0f03522b50de3f77d00e9622b2d1ea7de33afa8586a0c3682d2616a814675d3 Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.536497 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwdh\" (UniqueName: \"kubernetes.io/projected/e06cf97d-355e-437b-8853-5088922efb9f-kube-api-access-mnwdh\") pod \"migrator-59844c95c7-swpdg\" (UID: \"e06cf97d-355e-437b-8853-5088922efb9f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" Dec 03 12:16:00 crc 
kubenswrapper[4666]: I1203 12:16:00.557430 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww5mf\" (UniqueName: \"kubernetes.io/projected/c11ccbf0-027c-45ae-b6ea-78b49ba17d3f-kube-api-access-ww5mf\") pod \"package-server-manager-789f6589d5-7q6pv\" (UID: \"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.562654 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.570835 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.573675 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.073651637 +0000 UTC m=+149.918612678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: W1203 12:16:00.574879 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-cf5be15655cb80dc62db81a84a2447adcc4fb2d6e8bfdea02b3f6a5230d7a46c WatchSource:0}: Error finding container cf5be15655cb80dc62db81a84a2447adcc4fb2d6e8bfdea02b3f6a5230d7a46c: Status 404 returned error can't find the container with id cf5be15655cb80dc62db81a84a2447adcc4fb2d6e8bfdea02b3f6a5230d7a46c Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.575510 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbblm\" (UniqueName: \"kubernetes.io/projected/fe37528e-77b6-4a4c-91e9-89674636636e-kube-api-access-dbblm\") pod \"multus-admission-controller-857f4d67dd-qgbrh\" (UID: \"fe37528e-77b6-4a4c-91e9-89674636636e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.598555 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vklf\" (UniqueName: \"kubernetes.io/projected/39a33410-5d77-47cc-a5c7-d41a8431dba0-kube-api-access-8vklf\") pod \"machine-config-server-kbcqm\" (UID: \"39a33410-5d77-47cc-a5c7-d41a8431dba0\") " pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.606529 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.623626 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lh9\" (UniqueName: \"kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9\") pod \"marketplace-operator-79b997595-tgdb2\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.627785 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.652639 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.652898 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbf4j\" (UniqueName: \"kubernetes.io/projected/a23456af-dc74-48f2-914b-84e7f6c549f1-kube-api-access-zbf4j\") pod \"catalog-operator-68c6474976-rk7q2\" (UID: \"a23456af-dc74-48f2-914b-84e7f6c549f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.658333 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fac2dbb-959e-4c17-993c-4e9593f00f99-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zvxbz\" (UID: \"0fac2dbb-959e-4c17-993c-4e9593f00f99\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.663461 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.673365 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.681129 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.700430 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.700854 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.200833629 +0000 UTC m=+150.045794680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.700880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxnw\" (UniqueName: \"kubernetes.io/projected/5dc119dc-2490-4f5e-b4e9-82c5f82004b0-kube-api-access-tzxnw\") pod \"service-ca-9c57cc56f-88qxb\" (UID: \"5dc119dc-2490-4f5e-b4e9-82c5f82004b0\") " pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.726956 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7kf8\" (UniqueName: \"kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8\") pod \"collect-profiles-29412735-vnxtv\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.744302 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nn69f" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.747252 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.769395 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvhz\" (UniqueName: \"kubernetes.io/projected/214640b1-fa87-4af1-a790-173866c1263c-kube-api-access-tfvhz\") pod \"machine-config-operator-74547568cd-tggsq\" (UID: \"214640b1-fa87-4af1-a790-173866c1263c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.772489 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d3672499acc5a86a7907a8cfdc1299617cdca6a91960000bad3b4b87b86668f4"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.790319 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.790352 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kbcqm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.793581 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" event={"ID":"7583101e-f814-41d1-9b78-086c48e16385","Type":"ContainerStarted","Data":"f893e7755991ae31c02a8e139f24f5295ef678fbc02109da5eca9af8791eaa1d"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.794948 4666 generic.go:334] "Generic (PLEG): container finished" podID="95b963c3-6c15-49a4-9e37-2e16d825e46f" containerID="0528639e0ef3533e13fe2384d1cf942673528d720371bd7ee20d15608551d48d" exitCode=0 Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.795052 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" event={"ID":"95b963c3-6c15-49a4-9e37-2e16d825e46f","Type":"ContainerDied","Data":"0528639e0ef3533e13fe2384d1cf942673528d720371bd7ee20d15608551d48d"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.799494 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" event={"ID":"95b963c3-6c15-49a4-9e37-2e16d825e46f","Type":"ContainerStarted","Data":"1979ac674128b59c18266ba79ae9712d432c2039d23a8edce78ad1acdfc2da7d"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.802363 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.802713 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.30269884 +0000 UTC m=+150.147659891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.815462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" event={"ID":"e6bf85a8-fe97-4a33-9441-04cb4c949118","Type":"ContainerStarted","Data":"fa37fd29d678df7992aa7c29b490abb4fabe3402a293e19510a4124a1213f18c"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.815515 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" event={"ID":"e6bf85a8-fe97-4a33-9441-04cb4c949118","Type":"ContainerStarted","Data":"003b793d369fdad0af1bf6d9287d3607ea10052a475964e61d10bbaac599b5f6"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.817320 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" event={"ID":"2cddd4e1-3283-4f65-a1bb-68d449471280","Type":"ContainerStarted","Data":"5e74f7f4dabce637c9b42895ef182cad1483991b64443d5d2b16f5b3a6141803"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.820231 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" event={"ID":"bb958399-7029-46f6-a41c-0cf7823af900","Type":"ContainerStarted","Data":"b0f03522b50de3f77d00e9622b2d1ea7de33afa8586a0c3682d2616a814675d3"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.820942 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" event={"ID":"b9731be9-1a36-490c-a448-222468842a67","Type":"ContainerStarted","Data":"8ae6c0aaee3cee9f76b7f1d41793de6f82f4e904c1354fc6900d703f61fa6cba"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.821992 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wj872" event={"ID":"06646f23-6df2-4308-9b72-fd7e108ad6e0","Type":"ContainerStarted","Data":"b1395eef2c6faeb979f644fae60afd1924ee15b7771ac0fabeea9373174c8c1f"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.849865 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cf5be15655cb80dc62db81a84a2447adcc4fb2d6e8bfdea02b3f6a5230d7a46c"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.850921 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.867970 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.883717 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.900073 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.903857 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:00 crc kubenswrapper[4666]: E1203 12:16:00.905313 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.405290501 +0000 UTC m=+150.250251542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.916279 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.918931 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5"] Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.927378 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxcq5" event={"ID":"b0b79044-b1ee-45fe-8b35-e9fc44f47e46","Type":"ContainerStarted","Data":"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.927424 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxcq5" event={"ID":"b0b79044-b1ee-45fe-8b35-e9fc44f47e46","Type":"ContainerStarted","Data":"e93d74602875117c1d4cf974caa522c62ea74e6070ffd04bbb5025ad8970e2c9"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.931637 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" event={"ID":"5c58a7e7-a332-4dd3-b7ed-cbaea0826134","Type":"ContainerStarted","Data":"e275be2fc7c0d168ae5f0628bea83c6577a60fbd50346d48eb64c52810279d9b"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.936363 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" event={"ID":"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc","Type":"ContainerStarted","Data":"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.936415 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" event={"ID":"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc","Type":"ContainerStarted","Data":"9eeaa989eb5c6ae64cb7f0f19578bd8a048aabfa330264835fdec194399cd4a4"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.936635 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.942079 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.946820 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.952800 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0847c4613c0b1051874d1b479b2616d6eb6629f27d37a3ab916b8a4c3a45e934"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.953439 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.956017 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" event={"ID":"7d685477-11f2-4bfb-98c2-6eb76b6697c3","Type":"ContainerStarted","Data":"b1353d447b307b37e5dc13f34dbb6ed4324d6a529f6f12f9da8ff0178fd69be0"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.956053 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" event={"ID":"7d685477-11f2-4bfb-98c2-6eb76b6697c3","Type":"ContainerStarted","Data":"57f4e88be3c98210b5247786ea09c449b7a4175a81b12f1b98f5644880201ba9"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.970233 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" event={"ID":"aa39afc3-04ae-42c9-b042-15a136a64fb4","Type":"ContainerStarted","Data":"7a7a557c59724348b6ba8147291ec15bd7d95f8933cf4d3c49c73a59db67ed51"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.970273 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" event={"ID":"aa39afc3-04ae-42c9-b042-15a136a64fb4","Type":"ContainerStarted","Data":"4a52ee67d2ab7a9b9aa593797f35bdafdc4f60654501ef8a2fdf93f1e47301ca"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.997652 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lts42" event={"ID":"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca","Type":"ContainerStarted","Data":"8fbdd0855a2f8450fc1061bdcf1a8e3b1e0804e3a554f94b40f8cb7146ccf0f3"} Dec 03 12:16:00 crc kubenswrapper[4666]: I1203 12:16:00.997726 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.000182 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" event={"ID":"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa","Type":"ContainerStarted","Data":"242ff90bbd19fb1f21025584bada83037bc06fec379faca79d87f213d8988d65"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.006468 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.007944 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.507928623 +0000 UTC m=+150.352889674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.008542 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.008598 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.058688 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" event={"ID":"86ecf1c7-b9b3-49b1-9b6a-421e96100984","Type":"ContainerStarted","Data":"e249ebc98cc61388abbb73a8238e424ef39ab76c39c519fb352784439b061c20"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.058758 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" event={"ID":"86ecf1c7-b9b3-49b1-9b6a-421e96100984","Type":"ContainerStarted","Data":"14bb85ccc3469a495146ff6ed21d87308e3a23885fae51cf597fff6211b4effb"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.104021 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fgm9v" event={"ID":"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581","Type":"ContainerStarted","Data":"e19d47c3bfd50f0e22c89d6171ed59305bae3845c5b5df3dddcfdef00b37a530"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.109323 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.111143 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.61110653 +0000 UTC m=+150.456067581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.123564 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" event={"ID":"f4cdba64-e137-43f9-a7a9-ced14dde212e","Type":"ContainerStarted","Data":"8fa3411e56637e82b82a2b070a4332a8fc15e1835096e197fa29f5472e2b57fd"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.144451 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rxcq5" podStartSLOduration=131.144428113 podStartE2EDuration="2m11.144428113s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:01.081205892 +0000 UTC m=+149.926166943" watchObservedRunningTime="2025-12-03 12:16:01.144428113 +0000 UTC m=+149.989389174" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.184019 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" event={"ID":"f45da577-5ce5-4221-a450-d12a58efb053","Type":"ContainerStarted","Data":"bbd22185b8081cc107707429df59f83e458945e8fe3aee349a07897ce26db107"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.220780 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.221846 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.721812976 +0000 UTC m=+150.566774027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.288808 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" event={"ID":"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5","Type":"ContainerStarted","Data":"b3c0c5eccbbbf11fa838f9c11c619e182ab9567589f830171fdc65bc717e00df"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.288862 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" event={"ID":"9f4cea0f-fee6-42c3-9aee-7ea22fe63bf5","Type":"ContainerStarted","Data":"1fa0462aab42ed608c3eeb1dabb188fb48e9605b20536c1a0e83a965ca0f3bfe"} Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.324217 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k4gbz"] Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.325685 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.326003 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.82594542 +0000 UTC m=+150.670906461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.326255 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.326641 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.826634109 +0000 UTC m=+150.671595160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.345901 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-56n9f" podStartSLOduration=131.345882932 podStartE2EDuration="2m11.345882932s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:01.342720684 +0000 UTC m=+150.187681735" watchObservedRunningTime="2025-12-03 12:16:01.345882932 +0000 UTC m=+150.190843983" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.349394 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.361394 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx"] Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.431739 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.432997 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:01.932932493 +0000 UTC m=+150.777893544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: W1203 12:16:01.480376 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod551310dc_b537_4268_a72a_899169944815.slice/crio-ee62f74fda2febb58257a5ad0343e152977194560f68a0b5b61c68986dbc675a WatchSource:0}: Error finding container ee62f74fda2febb58257a5ad0343e152977194560f68a0b5b61c68986dbc675a: Status 404 returned error can't find the container with id ee62f74fda2febb58257a5ad0343e152977194560f68a0b5b61c68986dbc675a Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.543799 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.544995 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.044979215 +0000 UTC m=+150.889940266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.603156 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv"] Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.621560 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d"] Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.647108 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.647566 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.147544136 +0000 UTC m=+150.992505187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.749974 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.750400 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.250385053 +0000 UTC m=+151.095346104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.818151 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlpmn" podStartSLOduration=131.818124299 podStartE2EDuration="2m11.818124299s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:01.809105279 +0000 UTC m=+150.654066330" watchObservedRunningTime="2025-12-03 12:16:01.818124299 +0000 UTC m=+150.663085350" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.854317 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.854683 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.354661551 +0000 UTC m=+151.199622602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.881658 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lts42" podStartSLOduration=131.881634638 podStartE2EDuration="2m11.881634638s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:01.878934063 +0000 UTC m=+150.723895134" watchObservedRunningTime="2025-12-03 12:16:01.881634638 +0000 UTC m=+150.726595689" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.934111 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" podStartSLOduration=131.93406996 podStartE2EDuration="2m11.93406996s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:01.932779034 +0000 UTC m=+150.777740105" watchObservedRunningTime="2025-12-03 12:16:01.93406996 +0000 UTC m=+150.779031021" Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.945014 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d42mb"] Dec 03 12:16:01 crc kubenswrapper[4666]: I1203 12:16:01.956451 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:01 crc kubenswrapper[4666]: E1203 12:16:01.956918 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.456902092 +0000 UTC m=+151.301863143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.057911 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.058338 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.55831744 +0000 UTC m=+151.403278491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.123404 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb"] Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.164342 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.164790 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.664774798 +0000 UTC m=+151.509735849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.282811 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.283484 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.783460105 +0000 UTC m=+151.628421156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.384851 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.385285 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.885268184 +0000 UTC m=+151.730229235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.407016 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fgm9v" event={"ID":"96e5f9ae-3c4b-4016-b1d5-c6a1a1326581","Type":"ContainerStarted","Data":"37aa2237b951ebdc72888903442b156fc50a4cc82341cfa4753bdf06efe43bb7"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.482062 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" event={"ID":"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693","Type":"ContainerStarted","Data":"000a57043f4f33334792af4e47fb3f85e61eace93efaea68aff20cdf59fc55ae"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.486300 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.487725 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:02.9876957 +0000 UTC m=+151.832656931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.509613 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wj872" event={"ID":"06646f23-6df2-4308-9b72-fd7e108ad6e0","Type":"ContainerStarted","Data":"88cbc7ffbc9404a5d0b96d7b32097403d20eed2172ceeb1d94e7a566bfa27991"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.527888 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.554444 4666 patch_prober.go:28] interesting pod/console-operator-58897d9998-wj872 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.554527 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wj872" podUID="06646f23-6df2-4308-9b72-fd7e108ad6e0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.555754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" event={"ID":"eadd4d63-6584-4b28-a233-8274a7941462","Type":"ContainerStarted","Data":"a3b51d147c541137351f1e5db04f98d56b94ea6500b82efcc55c531921c3f49a"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.605676 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.606324 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.106307104 +0000 UTC m=+151.951268155 (durationBeforeRetry 500ms). 
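The console-operator readiness failure above is the ordinary probe path: the kubelet issues an HTTP GET against the pod IP and treats a refused connection or an error status as "not ready". A rough stand-in for that check; the URL is taken from the log, and the TLS-verification skip is a shortcut for the sketch, not how the kubelet handles probe certificates:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://10.217.0.20:8443/readyz")
	if err != nil {
		// Matches the logged failure mode: "connect: connection refused",
		// i.e. the container is running but nothing is listening yet.
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Println("probe success:", resp.Status)
	} else {
		fmt.Println("probe failure:", resp.Status)
	}
}
```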
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.608361 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" event={"ID":"6a60617f-4a55-4818-a37a-2d745a296b97","Type":"ContainerStarted","Data":"a1bf2bcd434671917b6096e709c9308da39f9f64d1d1798adcb9219db71ce326"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.610236 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" event={"ID":"7d685477-11f2-4bfb-98c2-6eb76b6697c3","Type":"ContainerStarted","Data":"8ac9a4a4d2529f5613c03fd4e5a7298185aaff57a8dba42ff2a641897f5f1f52"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.612074 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"30b3b3a83fada24b2622772aeec9ec95b550221bd726b84e4ac35b9d348fc942"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.660489 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" event={"ID":"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6","Type":"ContainerStarted","Data":"5d70f91d025a16c471a8935a0d2d1c608299322717589f107d1b8d4dfc7e3683"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.682563 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e6842032cfa0decee48cfb67ef73ab64f775dd86e232a201786623b8d34f0e24"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.710395 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.711864 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.211839497 +0000 UTC m=+152.056800558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.769280 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" event={"ID":"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2","Type":"ContainerStarted","Data":"b511af46e93de431b13d4502a98c91d173ce470b1a98eb03a9c5d872676ab1c4"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.777598 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"351d1c66f19503364380da8d96f5517bcb54aaabcf99f3e7b01ef1a60bb971b3"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.802244 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" event={"ID":"7583101e-f814-41d1-9b78-086c48e16385","Type":"ContainerStarted","Data":"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627"} Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.803517 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.817834 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.818393 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fgm9v" podStartSLOduration=132.818368357 podStartE2EDuration="2m12.818368357s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:02.735135002 +0000 UTC m=+151.580096053" watchObservedRunningTime="2025-12-03 12:16:02.818368357 +0000 UTC m=+151.663329428" Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.819167 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.319153469 +0000 UTC m=+152.164114520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.832945 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb"] Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.835379 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8gr95" podStartSLOduration=132.835363218 podStartE2EDuration="2m12.835363218s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:02.833578678 +0000 UTC m=+151.678539729" watchObservedRunningTime="2025-12-03 12:16:02.835363218 +0000 UTC m=+151.680324259" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.854478 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.921891 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.922667 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.422631884 +0000 UTC m=+152.267592935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.932751 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:02 crc kubenswrapper[4666]: E1203 12:16:02.934742 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.434724519 +0000 UTC m=+152.279685570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:02 crc kubenswrapper[4666]: I1203 12:16:02.939902 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" event={"ID":"f45da577-5ce5-4221-a450-d12a58efb053","Type":"ContainerStarted","Data":"4bd2779ba27b6c17266e19467739abd4d86fc84896050d4a06aad7324c14ca4e"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.022294 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kbcqm" event={"ID":"39a33410-5d77-47cc-a5c7-d41a8431dba0","Type":"ContainerStarted","Data":"c89b43ec04a0fc249ef0a4e54ecb511b34111e99eb044d17848335c0deb4ac98"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.035038 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.035690 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.535668294 +0000 UTC m=+152.380629345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.049102 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k4gbz" event={"ID":"551310dc-b537-4268-a72a-899169944815","Type":"ContainerStarted","Data":"ee62f74fda2febb58257a5ad0343e152977194560f68a0b5b61c68986dbc675a"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.068684 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" event={"ID":"779f8827-42c7-4e5a-a89b-ba23d7e11e14","Type":"ContainerStarted","Data":"96f9273089333ce7bb8d05390d98afb3f84895105ecf9ac75e6c6e6615c908ab"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.143271 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.143925 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.643908482 +0000 UTC m=+152.488869533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.153620 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" event={"ID":"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f","Type":"ContainerStarted","Data":"5b1a9e796098489b0f29a59f468ae6c01b6fdede63d377b405db8fe4d2c7bbd7"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.178900 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.221420 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.244802 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.245247 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.745226477 +0000 UTC m=+152.590187528 (durationBeforeRetry 500ms). 
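The interleaving visible here ("SyncLoop UPDATE" from the API server, "SyncLoop (PLEG)" from container-runtime relists, "SyncLoop (probe)" from the probers) is one loop multiplexing independent event sources. A schematic of that select pattern; the channel and type names are illustrative, not the kubelet's own:

```go
package main

import (
	"fmt"
	"time"
)

// Illustrative event types for the three sources seen in the log.
type podUpdate struct{ source, pod string }
type plegEvent struct{ pod, event string }
type probeResult struct{ pod, probe, status string }

func main() {
	updates := make(chan podUpdate, 1)
	pleg := make(chan plegEvent, 1)
	probes := make(chan probeResult, 1)

	updates <- podUpdate{"api", "openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99"}
	pleg <- plegEvent{"openshift-ingress/router-default-5444994796-fgm9v", "ContainerStarted"}
	probes <- probeResult{"openshift-ingress/router-default-5444994796-fgm9v", "startup", "unhealthy"}

	// One loop drains whichever source is ready, as the kubelet's sync
	// loop does; the timeout stands in for periodic housekeeping.
	for i := 0; i < 3; i++ {
		select {
		case u := <-updates:
			fmt.Printf("SyncLoop UPDATE source=%q pod=%q\n", u.source, u.pod)
		case e := <-pleg:
			fmt.Printf("SyncLoop (PLEG): %s for %s\n", e.event, e.pod)
		case p := <-probes:
			fmt.Printf("SyncLoop (probe) %s=%s for %s\n", p.probe, p.status, p.pod)
		case <-time.After(time.Second):
			fmt.Println("housekeeping tick")
		}
	}
}
```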
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.267666 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lts42" event={"ID":"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca","Type":"ContainerStarted","Data":"0465b448e2f7ec168c1428e7e3bea554e72f809acad36dca7d82852c282ba85c"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.269054 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.269442 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.303372 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:03 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:03 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:03 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.303442 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.316239 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qgbrh"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.340429 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" event={"ID":"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa","Type":"ContainerStarted","Data":"d8a1a7f60408ec28cfaf59c1891d1258b2da8c1038135efb64efcf504b22fcf4"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.346292 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.346788 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 12:16:03.846771249 +0000 UTC m=+152.691732300 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.418516 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" event={"ID":"2cddd4e1-3283-4f65-a1bb-68d449471280","Type":"ContainerStarted","Data":"7f54431144412239a93c31c2db9dd4141cc55da84d65b6d9688a7dd7cd426b36"} Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.418683 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.450532 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.450952 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:03.950908063 +0000 UTC m=+152.795869114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.473012 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.489282 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nn69f"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.507902 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.511475 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-88qxb"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.532881 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.552388 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.556234 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.056199759 +0000 UTC m=+152.901160810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.575245 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-f4r9r" podStartSLOduration=133.575220916 podStartE2EDuration="2m13.575220916s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:03.550713217 +0000 UTC m=+152.395674288" watchObservedRunningTime="2025-12-03 12:16:03.575220916 +0000 UTC m=+152.420181977" Dec 03 12:16:03 crc kubenswrapper[4666]: W1203 12:16:03.628210 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dc119dc_2490_4f5e_b4e9_82c5f82004b0.slice/crio-7c689037ea58d228d20a61d15f64457af5762d46b131bab0a003c5a8a73952a0 WatchSource:0}: Error finding container 7c689037ea58d228d20a61d15f64457af5762d46b131bab0a003c5a8a73952a0: Status 404 returned error can't find the container with id 7c689037ea58d228d20a61d15f64457af5762d46b131bab0a003c5a8a73952a0 Dec 03 12:16:03 crc kubenswrapper[4666]: W1203 12:16:03.658632 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode06cf97d_355e_437b_8853_5088922efb9f.slice/crio-bded97e15a8c5da47cf15a2141cb91bd2eac80b745b6cb07f0f5bf53b230c6ea WatchSource:0}: Error finding container bded97e15a8c5da47cf15a2141cb91bd2eac80b745b6cb07f0f5bf53b230c6ea: Status 404 returned error can't find the container with id bded97e15a8c5da47cf15a2141cb91bd2eac80b745b6cb07f0f5bf53b230c6ea Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.659690 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.660056 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.160037574 +0000 UTC m=+153.004998625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.727876 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" podStartSLOduration=132.727857792 podStartE2EDuration="2m12.727857792s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:03.640633517 +0000 UTC m=+152.485594568" watchObservedRunningTime="2025-12-03 12:16:03.727857792 +0000 UTC m=+152.572818843" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.754889 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.788797 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.789687 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.289654434 +0000 UTC m=+153.134615485 (durationBeforeRetry 500ms). 
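The pod_startup_latency_tracker entries record two durations: an end-to-end figure from podCreationTimestamp to the observed running time, and an SLO figure that excludes image-pull time (both pull timestamps are the zero time here, so the two numbers coincide). A small check of that arithmetic using the route-controller-manager entry above; reading the SLO as E2E minus the pull window is my interpretation, not a formula quoted from the kubelet:

```go
package main

import (
	"fmt"
	"time"
)

// Recompute the route-controller-manager figures from the entry above:
// created 12:13:51, observed running 12:16:03.727857792, no image pull.
func main() {
	created, _ := time.Parse(time.RFC3339, "2025-12-03T12:13:51Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-12-03T12:16:03.727857792Z")
	e2e := running.Sub(created)
	var imagePull time.Duration // firstStartedPulling/lastFinishedPulling are the zero time
	fmt.Println("podStartE2EDuration:", e2e)           // 2m12.727857792s, as logged
	fmt.Println("podStartSLOduration:", e2e-imagePull) // same, since nothing was pulled
}
```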
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.812900 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.840909 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.879339 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"] Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.900402 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.902279 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dn4sx" podStartSLOduration=132.902249602 podStartE2EDuration="2m12.902249602s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:03.838559698 +0000 UTC m=+152.683520759" watchObservedRunningTime="2025-12-03 12:16:03.902249602 +0000 UTC m=+152.747210663" Dec 03 12:16:03 crc kubenswrapper[4666]: E1203 12:16:03.914969 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.414941103 +0000 UTC m=+153.259902154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.926447 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wj872" podStartSLOduration=133.926418081 podStartE2EDuration="2m13.926418081s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:03.917395471 +0000 UTC m=+152.762356522" watchObservedRunningTime="2025-12-03 12:16:03.926418081 +0000 UTC m=+152.771379132" Dec 03 12:16:03 crc kubenswrapper[4666]: I1203 12:16:03.975267 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" podStartSLOduration=133.975239703 podStartE2EDuration="2m13.975239703s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:03.971732426 +0000 UTC m=+152.816693477" watchObservedRunningTime="2025-12-03 12:16:03.975239703 +0000 UTC m=+152.820200774" Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.005308 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.005844 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.505825 +0000 UTC m=+153.350786051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.088426 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq"] Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.106468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.107015 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.606991951 +0000 UTC m=+153.451953002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.208666 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.209198 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.709181081 +0000 UTC m=+153.554142132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.220917 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:04 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:04 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:04 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.221002 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.312578 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.313444 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.813416378 +0000 UTC m=+153.658377429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.416612 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.416955 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:04.916941114 +0000 UTC m=+153.761902165 (durationBeforeRetry 500ms). 
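The router's startup-probe output above is the aggregated healthz idiom: every named subcheck prints a [+] or [-] line, and the endpoint returns 500 until all of them pass, which is why the probe reports "HTTP probe failed with statuscode: 500" even while [+]process-running is already ok. A minimal sketch of such a handler; the check names mirror the log, while the hard-coded pass/fail closures and the port are placeholders:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	type check struct {
		name string
		ok   func() bool
	}
	checks := []check{
		{"backend-http", func() bool { return false }},    // [-] in the log
		{"has-synced", func() bool { return false }},      // [-] in the log
		{"process-running", func() bool { return true }},  // [+] in the log
	}
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		healthy := true
		body := ""
		for _, c := range checks {
			if c.ok() {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			} else {
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
				healthy = false
			}
		}
		if !healthy {
			w.WriteHeader(http.StatusInternalServerError) // probe sees statuscode: 500
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	})
	log.Fatal(http.ListenAndServe(":1936", nil)) // port is a placeholder
}
```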
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.481283 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nn69f" event={"ID":"fb2d553f-f328-499a-a93a-f7d62e54f118","Type":"ContainerStarted","Data":"7ad43ae2dd483057e068019267a084de7afe9e31e25f537270baf32cd3432564"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.483365 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" event={"ID":"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae","Type":"ContainerStarted","Data":"9d2cae55d8f2d5346df0b6046a31283e41551cbc7fc72af71b52b58e5e5234cd"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.515119 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" event={"ID":"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2","Type":"ContainerStarted","Data":"393856e0e8ac61c625a3eb89377e521efa9ed0ff63b846f9be1110f0b0f65ad8"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.521726 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.522103 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.022071285 +0000 UTC m=+153.867032336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.575008 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" event={"ID":"029e08a8-b4d9-470b-9a9e-364f1a52fd2f","Type":"ContainerStarted","Data":"99ec72a642416d245b60105245b72a8458d3c26257d6d925cc08e4a98cf8e894"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.606491 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" event={"ID":"eec32d83-8cb7-4573-a0eb-045cb7df0458","Type":"ContainerStarted","Data":"d052959e68847c91822be2a34c87a617a273865f3f292856fa32388355d2a508"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.622965 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.623320 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.123302519 +0000 UTC m=+153.968263560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.646741 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" event={"ID":"6a60617f-4a55-4818-a37a-2d745a296b97","Type":"ContainerStarted","Data":"12c752e9c81cdd4524e002d00d25c73297f38bd5283eb81138232fe6a9b9a307"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.667573 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" event={"ID":"a23456af-dc74-48f2-914b-84e7f6c549f1","Type":"ContainerStarted","Data":"51d84bd5fba2a7404034649172c2e2c2b090b35928ed9dd47d19b5d79d22889a"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.676403 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" event={"ID":"0fac2dbb-959e-4c17-993c-4e9593f00f99","Type":"ContainerStarted","Data":"d57edb8f74b9bcdc7d9932568201f4fdabbf9f1fa324ff713c73ee4d9ebf18c6"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.692211 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" event={"ID":"5dc119dc-2490-4f5e-b4e9-82c5f82004b0","Type":"ContainerStarted","Data":"7c689037ea58d228d20a61d15f64457af5762d46b131bab0a003c5a8a73952a0"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.704001 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" event={"ID":"7c3e34f2-4982-4638-8f87-92318d6105ea","Type":"ContainerStarted","Data":"abae9632470da91ac9922bccfbec2d3de55bfbda5025ec61425f5269c34b1f38"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.714210 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" event={"ID":"214640b1-fa87-4af1-a790-173866c1263c","Type":"ContainerStarted","Data":"4a0dbc85f6d1fbc6503bb2dd1d6078cfd9891cb0280f3e50bf0e89d737501fa5"} Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.723746 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.724282 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.224258894 +0000 UTC m=+154.069219945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.724806 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" event={"ID":"95b963c3-6c15-49a4-9e37-2e16d825e46f","Type":"ContainerStarted","Data":"b0dcce0a3d9214929565d5d66cb2c1bab35a42e39a6f3638649815bbabe1d4d0"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.725150 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq"
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.762624 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8jjc5" podStartSLOduration=134.762607036 podStartE2EDuration="2m14.762607036s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:04.755082938 +0000 UTC m=+153.600043999" watchObservedRunningTime="2025-12-03 12:16:04.762607036 +0000 UTC m=+153.607568087"
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.771448 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" event={"ID":"f4cdba64-e137-43f9-a7a9-ced14dde212e","Type":"ContainerStarted","Data":"18572f478c04bdf1ae1370ce6eac618dc2471ae737d34443dc5b40406c94c82c"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.811337 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" event={"ID":"3d4c3bf9-0ee9-427b-9aa1-fbf9bddb8693","Type":"ContainerStarted","Data":"fbb2789b6e733ce234ef2de77ce91390fe2a9a2350f0f88474e0be5f5ec538e1"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.828414 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" event={"ID":"779f8827-42c7-4e5a-a89b-ba23d7e11e14","Type":"ContainerStarted","Data":"1d976dffe9c13fb613c78a4d2afb2f85256ac7d5dbf3932cce39dfc62d3c2b92"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.837763 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" event={"ID":"8b3e7281-d500-4cd3-bec9-daec25718f94","Type":"ContainerStarted","Data":"551a17bff36f8f273400bfb0b1ecfe812ccc88b0417e1b18ca463d2877090cb4"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.837798 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" event={"ID":"8b3e7281-d500-4cd3-bec9-daec25718f94","Type":"ContainerStarted","Data":"66d578f87557246c6d1f1ba1214d6cd02e990dee7b94c1967beea6488074f798"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.839499 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.841823 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.341802359 +0000 UTC m=+154.186763620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.851560 4666 generic.go:334] "Generic (PLEG): container finished" podID="bb958399-7029-46f6-a41c-0cf7823af900" containerID="443bc79f7dab1d6ed45d933d31dae1f7106a5275d5ba8f48e27e8163ea6ce255" exitCode=0
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.851644 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" event={"ID":"bb958399-7029-46f6-a41c-0cf7823af900","Type":"ContainerDied","Data":"443bc79f7dab1d6ed45d933d31dae1f7106a5275d5ba8f48e27e8163ea6ce255"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.869513 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" podStartSLOduration=134.869485706 podStartE2EDuration="2m14.869485706s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:04.866938145 +0000 UTC m=+153.711899196" watchObservedRunningTime="2025-12-03 12:16:04.869485706 +0000 UTC m=+153.714446747"
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.902411 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" event={"ID":"e06cf97d-355e-437b-8853-5088922efb9f","Type":"ContainerStarted","Data":"bded97e15a8c5da47cf15a2141cb91bd2eac80b745b6cb07f0f5bf53b230c6ea"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.910258 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" event={"ID":"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f","Type":"ContainerStarted","Data":"29d78b80886f2adba8b222582f8baca4147ffe0f61a612f132eddaefff35b930"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.920020 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqgrb" podStartSLOduration=133.919993375 podStartE2EDuration="2m13.919993375s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:04.919133561 +0000 UTC m=+153.764094612" watchObservedRunningTime="2025-12-03 12:16:04.919993375 +0000 UTC m=+153.764954426"
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.920197 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" event={"ID":"eadd4d63-6584-4b28-a233-8274a7941462","Type":"ContainerStarted","Data":"81fc152e49ae3857186f79e267824e331523a7f10dcd130882d655160c3ce3bc"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.942697 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kbcqm" event={"ID":"39a33410-5d77-47cc-a5c7-d41a8431dba0","Type":"ContainerStarted","Data":"e5f91719d87b747409fef42a1cd8bcd91aef5edc03c96c71bed59b4d7bfe8806"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.943510 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 12:16:04 crc kubenswrapper[4666]: E1203 12:16:04.951177 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.451148207 +0000 UTC m=+154.296109258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.990076 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerStarted","Data":"90f995b216ab1a93eeaa3ca2cd345167f4f3a6ee64b306e4dbe0aae10e79c944"}
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.991667 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2"
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.992730 4666 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgdb2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body=
Dec 03 12:16:04 crc kubenswrapper[4666]: I1203 12:16:04.992766 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused"
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.031343 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" event={"ID":"5c58a7e7-a332-4dd3-b7ed-cbaea0826134","Type":"ContainerStarted","Data":"9f46c70117f74dd97eb84d47ee328facfcaba008ff7dd09bba1d9718f32ae4b0"}
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.055381 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.058026 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.558011547 +0000 UTC m=+154.402972598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.064098 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k4gbz" event={"ID":"551310dc-b537-4268-a72a-899169944815","Type":"ContainerStarted","Data":"42c988e8e119a9ded9e45063cb3827dc7f703756de39c9745109a7504bc79d27"}
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.080991 4666 generic.go:334] "Generic (PLEG): container finished" podID="b9731be9-1a36-490c-a448-222468842a67" containerID="4c99872592a7463270cbc82e2e165534233e5bb5c58352c3c1f2c099f00684f0" exitCode=0
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.081058 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" event={"ID":"b9731be9-1a36-490c-a448-222468842a67","Type":"ContainerDied","Data":"4c99872592a7463270cbc82e2e165534233e5bb5c58352c3c1f2c099f00684f0"}
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.098287 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" event={"ID":"fe37528e-77b6-4a4c-91e9-89674636636e","Type":"ContainerStarted","Data":"6485a3171159f86835558b9610af8b0fab871ba46c92418409946af76f7f948c"}
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.115078 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rn69k" podStartSLOduration=134.115057596 podStartE2EDuration="2m14.115057596s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.06680252 +0000 UTC m=+153.911763581" watchObservedRunningTime="2025-12-03 12:16:05.115057596 +0000 UTC m=+153.960018637"
Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.166425 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.169836 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.668016003 +0000 UTC m=+154.512977054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.175986 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" podStartSLOduration=134.175960603 podStartE2EDuration="2m14.175960603s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.174189244 +0000 UTC m=+154.019150295" watchObservedRunningTime="2025-12-03 12:16:05.175960603 +0000 UTC m=+154.020921674" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.213163 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bjvpx" podStartSLOduration=134.213133492 podStartE2EDuration="2m14.213133492s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.11701191 +0000 UTC m=+153.961972981" watchObservedRunningTime="2025-12-03 12:16:05.213133492 +0000 UTC m=+154.058094543" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.235056 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" event={"ID":"f45da577-5ce5-4221-a450-d12a58efb053","Type":"ContainerStarted","Data":"163dce9088d604a857f1ffc691214ed3fb925de312050b876facadcc4fc8bacd"} Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.241132 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.241196 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.270003 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wj872" Dec 03 
12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.277118 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.278981 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.778965445 +0000 UTC m=+154.623926496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.288616 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:05 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:05 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:05 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.288711 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.335327 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" podStartSLOduration=134.335297765 podStartE2EDuration="2m14.335297765s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.233254339 +0000 UTC m=+154.078215390" watchObservedRunningTime="2025-12-03 12:16:05.335297765 +0000 UTC m=+154.180258816" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.347624 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fsm4g" podStartSLOduration=135.347593806 podStartE2EDuration="2m15.347593806s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.341333262 +0000 UTC m=+154.186294323" watchObservedRunningTime="2025-12-03 12:16:05.347593806 +0000 UTC m=+154.192554857" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.378713 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.379485 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.879451558 +0000 UTC m=+154.724412609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.379535 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.380887 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.880866347 +0000 UTC m=+154.725827408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.482615 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.482979 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:05.982956274 +0000 UTC m=+154.827917325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.534479 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kbcqm" podStartSLOduration=8.53445807 podStartE2EDuration="8.53445807s" podCreationTimestamp="2025-12-03 12:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.533520515 +0000 UTC m=+154.378481566" watchObservedRunningTime="2025-12-03 12:16:05.53445807 +0000 UTC m=+154.379419121" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.585656 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.586053 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.086040869 +0000 UTC m=+154.931001920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.603652 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5wd5d" podStartSLOduration=135.603627516 podStartE2EDuration="2m15.603627516s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.590373659 +0000 UTC m=+154.435334710" watchObservedRunningTime="2025-12-03 12:16:05.603627516 +0000 UTC m=+154.448588567" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.692501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.692946 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.192929639 +0000 UTC m=+155.037890690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.749317 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8zltk" podStartSLOduration=135.74929427 podStartE2EDuration="2m15.74929427s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:05.738612324 +0000 UTC m=+154.583573375" watchObservedRunningTime="2025-12-03 12:16:05.74929427 +0000 UTC m=+154.594255321" Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.800328 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.800751 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.300735884 +0000 UTC m=+155.145696935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:05 crc kubenswrapper[4666]: I1203 12:16:05.901125 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:05 crc kubenswrapper[4666]: E1203 12:16:05.902017 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.401980868 +0000 UTC m=+155.246941919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.004235 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.004630 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.50461506 +0000 UTC m=+155.349576111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.105982 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.106270 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.606221493 +0000 UTC m=+155.451182554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.106715 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.107234 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.60722133 +0000 UTC m=+155.452182571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.207747 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.208008 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.70796338 +0000 UTC m=+155.552924431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.208170 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.208621 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.708601168 +0000 UTC m=+155.553562219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.222451 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:06 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:06 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:06 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.222539 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.241209 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" event={"ID":"f7d1dfeb-f488-4924-8d1c-dc8a32a124aa","Type":"ContainerStarted","Data":"7c51d28b01901a309e3b2f875c6e9b2411a04f1e5a263756e25a846cc99ddbbc"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.247381 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" event={"ID":"fe37528e-77b6-4a4c-91e9-89674636636e","Type":"ContainerStarted","Data":"c66047ea35e94a3145a4950d37d40bc5f332ba61a9cfd12d42a14c97342df40e"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.250639 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" 
event={"ID":"c11ccbf0-027c-45ae-b6ea-78b49ba17d3f","Type":"ContainerStarted","Data":"3f279856cae8ed18bc58e2a8446a4eb887f5ce77759389655e9198981c266cd1"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.250843 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.252899 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" event={"ID":"eec32d83-8cb7-4573-a0eb-045cb7df0458","Type":"ContainerStarted","Data":"33560af4c1e573397336e601595354ff01db3d602e1aca80bf83bd5a42b71ea0"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.253133 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.255203 4666 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7pw99 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.255264 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" podUID="eec32d83-8cb7-4573-a0eb-045cb7df0458" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.256080 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" event={"ID":"61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae","Type":"ContainerStarted","Data":"f1a3bc7b5626f91cb55ed08ccb13bf97705281412c744f7aedb68189723544f3"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.259268 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" event={"ID":"5dc119dc-2490-4f5e-b4e9-82c5f82004b0","Type":"ContainerStarted","Data":"7682ba5829cd34f5bb2499612267bf98a9d5dfc92ca21cf86aacd48d82391416"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.261792 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" event={"ID":"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6","Type":"ContainerStarted","Data":"4d61cdff6fab5e6e68f1ebb1b66ea4800efc978007337bde1a6de233abde79fa"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.263571 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vx2dk" event={"ID":"779f8827-42c7-4e5a-a89b-ba23d7e11e14","Type":"ContainerStarted","Data":"bf3d48ecf1e49dff0bf635152e55bbd34cb6efa8fced0824f11ec32260d153e1"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.267533 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" event={"ID":"a23456af-dc74-48f2-914b-84e7f6c549f1","Type":"ContainerStarted","Data":"797883ecdec3b5a2e6ff64fc1f1f64a6e2b14785df58d4e1266548bec206e2b9"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.267923 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.269106 4666 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rk7q2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.269164 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" podUID="a23456af-dc74-48f2-914b-84e7f6c549f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.271493 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerStarted","Data":"c923133b9c6778c948c059975149403cd2850660aa03f5a47c31cf0121d57dcf"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.272256 4666 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgdb2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.272310 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.274632 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" event={"ID":"7c3e34f2-4982-4638-8f87-92318d6105ea","Type":"ContainerStarted","Data":"953ff7d8538698879cb29f3dde68de1a86e901cb0559f5fd3afdbf69a818d5f6"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.275006 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.276232 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" event={"ID":"e06cf97d-355e-437b-8853-5088922efb9f","Type":"ContainerStarted","Data":"5eb6b7186eab85aaf270c78775114881eac5317517c81fc14f351df196947733"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.278305 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nn69f" event={"ID":"fb2d553f-f328-499a-a93a-f7d62e54f118","Type":"ContainerStarted","Data":"a6920c5d4e910be73cbaf26227272882ff2b3f15d50379f5ddab02a0a67533ee"} Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.288259 4666 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k9dxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 
12:16:06.288341 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" podUID="7c3e34f2-4982-4638-8f87-92318d6105ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.292690 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cm86d" podStartSLOduration=136.292669776 podStartE2EDuration="2m16.292669776s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.290953218 +0000 UTC m=+155.135914269" watchObservedRunningTime="2025-12-03 12:16:06.292669776 +0000 UTC m=+155.137630827" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.296819 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wjhvq" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.308764 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.314020 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.813989276 +0000 UTC m=+155.658950327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.377996 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-88qxb" podStartSLOduration=135.377972938 podStartE2EDuration="2m15.377972938s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.377801194 +0000 UTC m=+155.222762255" watchObservedRunningTime="2025-12-03 12:16:06.377972938 +0000 UTC m=+155.222933989" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.403548 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" podStartSLOduration=135.403522086 podStartE2EDuration="2m15.403522086s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.399211256 +0000 UTC m=+155.244172307" watchObservedRunningTime="2025-12-03 12:16:06.403522086 +0000 UTC m=+155.248483127" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.414160 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.414705 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:06.914683305 +0000 UTC m=+155.759644356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.443097 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nn69f" podStartSLOduration=9.443062271 podStartE2EDuration="9.443062271s" podCreationTimestamp="2025-12-03 12:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.442987949 +0000 UTC m=+155.287948990" watchObservedRunningTime="2025-12-03 12:16:06.443062271 +0000 UTC m=+155.288023322" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.484489 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" podStartSLOduration=135.484466127 podStartE2EDuration="2m15.484466127s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.464469523 +0000 UTC m=+155.309430574" watchObservedRunningTime="2025-12-03 12:16:06.484466127 +0000 UTC m=+155.329427178" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.503462 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" podStartSLOduration=135.503366791 podStartE2EDuration="2m15.503366791s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.483529831 +0000 UTC m=+155.328490882" watchObservedRunningTime="2025-12-03 12:16:06.503366791 +0000 UTC m=+155.348327842" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.504013 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" podStartSLOduration=135.504005978 podStartE2EDuration="2m15.504005978s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:06.501622772 +0000 UTC m=+155.346583823" watchObservedRunningTime="2025-12-03 12:16:06.504005978 +0000 UTC m=+155.348967029" Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.515426 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.515572 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 12:16:07.015542818 +0000 UTC m=+155.860503869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.515843 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.516374 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.01634847 +0000 UTC m=+155.861309521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.617472 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.617727 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.117691047 +0000 UTC m=+155.962652098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.617838 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.618247 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.118239172 +0000 UTC m=+155.963200233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.718801 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.719047 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.219009593 +0000 UTC m=+156.063970644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.719243 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.719585 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.219571938 +0000 UTC m=+156.064532989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.820145 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.820484 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.320466052 +0000 UTC m=+156.165427103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:06 crc kubenswrapper[4666]: I1203 12:16:06.924187 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:06 crc kubenswrapper[4666]: E1203 12:16:06.924649 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.424626737 +0000 UTC m=+156.269587788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.025569 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.025946 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.525912332 +0000 UTC m=+156.370873383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.026211 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.026708 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.526688674 +0000 UTC m=+156.371649725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.127861 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.128039 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.628009609 +0000 UTC m=+156.472970670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.128261 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.128694 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.628684668 +0000 UTC m=+156.473645719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.219187 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:07 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:07 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:07 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.219277 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.244004 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.244685 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.74466417 +0000 UTC m=+156.589625221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.295126 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" event={"ID":"bb958399-7029-46f6-a41c-0cf7823af900","Type":"ContainerStarted","Data":"c43f45d90366f6d81ebb2dc438bdb5f465bec366b9a04bc1a046b5671b92b13b"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.305781 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" event={"ID":"24aef896-5b2d-4c09-8230-ce7bd5b3a0f2","Type":"ContainerStarted","Data":"a26473be81298faa68638e56546c1adf0da5d4bd44d6ce4b789d08aaead41f41"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.309334 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" event={"ID":"b9731be9-1a36-490c-a448-222468842a67","Type":"ContainerStarted","Data":"3581db102ffc5fef03b07e6fa2f63fda01026de3049a1c18964469e21801d2ee"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.314193 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" event={"ID":"029e08a8-b4d9-470b-9a9e-364f1a52fd2f","Type":"ContainerStarted","Data":"efcc894c7944ffaa899f2f883b007fa6e92e6b111af861c688aef827db16fcc9"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.316182 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" event={"ID":"0fac2dbb-959e-4c17-993c-4e9593f00f99","Type":"ContainerStarted","Data":"60090694e36373d283e8282f1255e12fe2e9dd03003b87919dbc90bdc03991ea"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.333651 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k4gbz" event={"ID":"551310dc-b537-4268-a72a-899169944815","Type":"ContainerStarted","Data":"686fa3afbe6af3f2c478a023408eaa172cad982026054fedd06fed3f5d541d19"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.333748 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k4gbz" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.338795 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" podStartSLOduration=136.338777136 podStartE2EDuration="2m16.338777136s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.335535416 +0000 UTC m=+156.180496467" watchObservedRunningTime="2025-12-03 12:16:07.338777136 +0000 UTC m=+156.183738187" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.340701 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" event={"ID":"214640b1-fa87-4af1-a790-173866c1263c","Type":"ContainerStarted","Data":"395075f1c9cb3fedeee943b3d882f79eaab987fe0f424815523afbfcf5933d3e"} Dec 
03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.340766 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" event={"ID":"214640b1-fa87-4af1-a790-173866c1263c","Type":"ContainerStarted","Data":"65a618791484a5997d54a89b13a4e9e8ffe5884dff91d81e7ae8cb4eff1a478a"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.345702 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.347705 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.847688783 +0000 UTC m=+156.692649834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.348310 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" event={"ID":"e06cf97d-355e-437b-8853-5088922efb9f","Type":"ContainerStarted","Data":"6524bb00d681e7a5ba1150a1e85ef686f313a2381b02b95c7aab64eb6b715f21"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.354018 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" event={"ID":"fe37528e-77b6-4a4c-91e9-89674636636e","Type":"ContainerStarted","Data":"3c5c20df1cf426a8702efff63b152fc2f2fcd9a40ca9c2489fa2b97227404945"} Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.355936 4666 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7pw99 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.355994 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" podUID="eec32d83-8cb7-4573-a0eb-045cb7df0458" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.356021 4666 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgdb2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.356151 4666 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.356411 4666 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rk7q2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.356434 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" podUID="a23456af-dc74-48f2-914b-84e7f6c549f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.412124 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vz6p" podStartSLOduration=137.412102376 podStartE2EDuration="2m17.412102376s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.379446122 +0000 UTC m=+156.224407173" watchObservedRunningTime="2025-12-03 12:16:07.412102376 +0000 UTC m=+156.257063427" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.446819 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.448672 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:07.948646018 +0000 UTC m=+156.793607069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.470234 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k4gbz" podStartSLOduration=10.470211006 podStartE2EDuration="10.470211006s" podCreationTimestamp="2025-12-03 12:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.468153709 +0000 UTC m=+156.313114780" watchObservedRunningTime="2025-12-03 12:16:07.470211006 +0000 UTC m=+156.315172057" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.487992 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" podStartSLOduration=67.487960847 podStartE2EDuration="1m7.487960847s" podCreationTimestamp="2025-12-03 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.412012074 +0000 UTC m=+156.256973125" watchObservedRunningTime="2025-12-03 12:16:07.487960847 +0000 UTC m=+156.332921898" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.551547 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.551969 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.051951499 +0000 UTC m=+156.896912560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.614652 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zvxbz" podStartSLOduration=136.614629565 podStartE2EDuration="2m16.614629565s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.528554861 +0000 UTC m=+156.373515912" watchObservedRunningTime="2025-12-03 12:16:07.614629565 +0000 UTC m=+156.459590616" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.616863 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-swpdg" podStartSLOduration=136.616857166 podStartE2EDuration="2m16.616857166s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.615483298 +0000 UTC m=+156.460444349" watchObservedRunningTime="2025-12-03 12:16:07.616857166 +0000 UTC m=+156.461818217" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.656846 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.657199 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.157177973 +0000 UTC m=+157.002139014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.710630 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bsdf6" podStartSLOduration=136.710601052 podStartE2EDuration="2m16.710601052s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.698336073 +0000 UTC m=+156.543297134" watchObservedRunningTime="2025-12-03 12:16:07.710601052 +0000 UTC m=+156.555562103" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.711222 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qgbrh" podStartSLOduration=136.711216419 podStartE2EDuration="2m16.711216419s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:07.648331308 +0000 UTC m=+156.493292349" watchObservedRunningTime="2025-12-03 12:16:07.711216419 +0000 UTC m=+156.556177470" Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.758819 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.759242 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.259225459 +0000 UTC m=+157.104186510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.972567 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.972786 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.472748822 +0000 UTC m=+157.317709873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:07 crc kubenswrapper[4666]: I1203 12:16:07.973287 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:07 crc kubenswrapper[4666]: E1203 12:16:07.973771 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.473751069 +0000 UTC m=+157.318712120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.086822 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.087076 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.587055967 +0000 UTC m=+157.432017018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.193534 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.194147 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.694114672 +0000 UTC m=+157.539075723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.226723 4666 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.231750 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:08 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:08 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:08 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.231847 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.295279 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.295758 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.795737016 +0000 UTC m=+157.640698067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.355798 4666 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k9dxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.355886 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" podUID="7c3e34f2-4982-4638-8f87-92318d6105ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.391867 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" event={"ID":"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6","Type":"ContainerStarted","Data":"6c78bcc983e5bebafa02090325cfb3ed9a6175af27aac508b38ee146b3c7cb24"} Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.400025 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.400433 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" event={"ID":"b9731be9-1a36-490c-a448-222468842a67","Type":"ContainerStarted","Data":"6ab06158871c2dde3298cf0b1959c913a21a27ef044d2eb52fba2ff993db08e5"} Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.400587 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:08.900568959 +0000 UTC m=+157.745530010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.468137 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tggsq" podStartSLOduration=137.468116389 podStartE2EDuration="2m17.468116389s" podCreationTimestamp="2025-12-03 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:08.465997901 +0000 UTC m=+157.310958952" watchObservedRunningTime="2025-12-03 12:16:08.468116389 +0000 UTC m=+157.313077440" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.501379 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.502819 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 12:16:09.00279793 +0000 UTC m=+157.847758981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.604383 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:08 crc kubenswrapper[4666]: E1203 12:16:08.604979 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 12:16:09.104946338 +0000 UTC m=+157.949907390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt6lp" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.617290 4666 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T12:16:08.226773136Z","Handler":null,"Name":""} Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.652578 4666 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.652984 4666 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.708712 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.752571 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.810118 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.858250 4666 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 12:16:08 crc kubenswrapper[4666]: I1203 12:16:08.858310 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.012952 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.013012 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.014965 4666 patch_prober.go:28] interesting pod/console-f9d7485db-rxcq5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.015024 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxcq5" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.102843 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.102929 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.102942 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.103013 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.224765 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:09 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:09 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:09 crc 
kubenswrapper[4666]: healthz check failed Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.224831 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.228985 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.229064 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.230132 4666 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rkcvk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.31:8443/livez\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.230213 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" podUID="b9731be9-1a36-490c-a448-222468842a67" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.31:8443/livez\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.256292 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.256604 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.381387 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" podStartSLOduration=139.381359289 podStartE2EDuration="2m19.381359289s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:08.594576791 +0000 UTC m=+157.439537862" watchObservedRunningTime="2025-12-03 12:16:09.381359289 +0000 UTC m=+158.226320360" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.382796 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.383934 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.394259 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.394637 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.414737 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.422481 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.422580 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.458683 4666 generic.go:334] "Generic (PLEG): container finished" podID="029e08a8-b4d9-470b-9a9e-364f1a52fd2f" containerID="efcc894c7944ffaa899f2f883b007fa6e92e6b111af861c688aef827db16fcc9" exitCode=0 Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.483933 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.484580 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.484646 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bgzdr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.484659 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" event={"ID":"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6","Type":"ContainerStarted","Data":"79d4b4328b07346d791a6dce170dc818c8f895fa8ba347c382d738121bbc3e5f"} Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.484672 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" event={"ID":"992b2aa0-0bf6-4164-afa7-8bbadb4ef1a6","Type":"ContainerStarted","Data":"605e58d3baa23c28a6d023e889d0c4fc18d7217232b297d1fe0e8d694ecc1274"} Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.484683 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" event={"ID":"029e08a8-b4d9-470b-9a9e-364f1a52fd2f","Type":"ContainerDied","Data":"efcc894c7944ffaa899f2f883b007fa6e92e6b111af861c688aef827db16fcc9"} Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.495678 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt6lp\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.505964 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-d42mb" podStartSLOduration=12.505936308999999 podStartE2EDuration="12.505936309s" podCreationTimestamp="2025-12-03 12:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:09.498862033 +0000 UTC m=+158.343823094" watchObservedRunningTime="2025-12-03 12:16:09.505936309 +0000 UTC m=+158.350897370" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.524960 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.525051 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.525178 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.582121 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.733494 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.734634 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.736729 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.737985 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.759731 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.795786 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.839613 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.839664 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.839696 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9srb7\" (UniqueName: \"kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.870297 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.870383 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.924917 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.926324 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.930925 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.946033 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957300 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957394 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9srb7\" (UniqueName: \"kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957457 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5cx\" (UniqueName: \"kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957494 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957529 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957621 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.957915 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.958200 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities\") pod \"community-operators-5fq8n\" (UID: 
\"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:09 crc kubenswrapper[4666]: I1203 12:16:09.991572 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9srb7\" (UniqueName: \"kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7\") pod \"community-operators-5fq8n\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") " pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.077846 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.078063 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5cx\" (UniqueName: \"kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.078167 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.078204 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.078741 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.079078 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.130592 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.134006 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5cx\" (UniqueName: \"kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx\") pod \"certified-operators-fk2lr\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") " pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.134812 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.134668 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.179664 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldn8\" (UniqueName: \"kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.179726 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.179775 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.215114 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.229590 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:10 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:10 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:10 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.229661 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.285966 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.286458 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldn8\" (UniqueName: \"kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.286493 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.293965 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.293997 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.297593 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.323406 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.336073 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldn8\" (UniqueName: \"kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8\") pod \"community-operators-rpltp\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.347217 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.354553 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.477474 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.479716 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7pw99" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.498496 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qccw\" (UniqueName: \"kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.498606 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.498670 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.584340 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k9dxb" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.600243 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qccw\" (UniqueName: \"kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.600328 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.600372 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.602115 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.602364 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.672876 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rk7q2" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.677707 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qccw\" (UniqueName: \"kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw\") pod \"certified-operators-4knmd\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.802122 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.905626 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:16:10 crc kubenswrapper[4666]: I1203 12:16:10.966114 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.113300 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.220143 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:11 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:11 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:11 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.220461 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.251778 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.260718 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.386824 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.398966 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:11 crc kubenswrapper[4666]: W1203 12:16:11.433041 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119ceb76_1272_4185_a39b_70203557b901.slice/crio-d4c34be7af90fb25ba9552fa7eb2d36573893a36a6ea661239d42da20c0d2450 WatchSource:0}: Error finding container d4c34be7af90fb25ba9552fa7eb2d36573893a36a6ea661239d42da20c0d2450: Status 404 returned error can't find the container with id d4c34be7af90fb25ba9552fa7eb2d36573893a36a6ea661239d42da20c0d2450 Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.503590 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.535434 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume\") pod \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.535504 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume\") pod \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.535550 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7kf8\" (UniqueName: \"kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8\") pod \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\" (UID: \"029e08a8-b4d9-470b-9a9e-364f1a52fd2f\") " Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.537156 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume" (OuterVolumeSpecName: "config-volume") pod "029e08a8-b4d9-470b-9a9e-364f1a52fd2f" (UID: "029e08a8-b4d9-470b-9a9e-364f1a52fd2f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.555886 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8" (OuterVolumeSpecName: "kube-api-access-m7kf8") pod "029e08a8-b4d9-470b-9a9e-364f1a52fd2f" (UID: "029e08a8-b4d9-470b-9a9e-364f1a52fd2f"). InnerVolumeSpecName "kube-api-access-m7kf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.556770 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "029e08a8-b4d9-470b-9a9e-364f1a52fd2f" (UID: "029e08a8-b4d9-470b-9a9e-364f1a52fd2f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.661156 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerStarted","Data":"d4c34be7af90fb25ba9552fa7eb2d36573893a36a6ea661239d42da20c0d2450"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.661216 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerStarted","Data":"c80cd7da4cda3524e8ea893e5aa1dd59041a015252053a7f79e0b67a731b5976"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.668015 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.668104 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.668117 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7kf8\" (UniqueName: \"kubernetes.io/projected/029e08a8-b4d9-470b-9a9e-364f1a52fd2f-kube-api-access-m7kf8\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.669920 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" event={"ID":"029e08a8-b4d9-470b-9a9e-364f1a52fd2f","Type":"ContainerDied","Data":"99ec72a642416d245b60105245b72a8458d3c26257d6d925cc08e4a98cf8e894"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.669959 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99ec72a642416d245b60105245b72a8458d3c26257d6d925cc08e4a98cf8e894" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.670072 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.677580 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7","Type":"ContainerStarted","Data":"41e2fd0a1f31d5c393ef341582f2229f528bae94355421f5424444d6f42786e5"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.679078 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" event={"ID":"09abca61-c4c9-4f0c-beb9-468eea7e3f95","Type":"ContainerStarted","Data":"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.679115 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" event={"ID":"09abca61-c4c9-4f0c-beb9-468eea7e3f95","Type":"ContainerStarted","Data":"8d235e6a3a60d451a15cdd15fb91f72166bb89d52c81a84cd716e3622da061f4"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.681665 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.700156 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerStarted","Data":"fe25a6399d799764b2638e12b9bd00bc19cc6178374c812c75c73c9034847859"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.716827 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerStarted","Data":"8628159afe7c9c8b1bd9c978d55856d8de5267c690050166ee036cfe00f5e132"} Dec 03 12:16:11 crc kubenswrapper[4666]: I1203 12:16:11.731230 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" podStartSLOduration=141.731210221 podStartE2EDuration="2m21.731210221s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:11.725445651 +0000 UTC m=+160.570406702" watchObservedRunningTime="2025-12-03 12:16:11.731210221 +0000 UTC m=+160.576171272" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.098919 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:16:12 crc kubenswrapper[4666]: E1203 12:16:12.099259 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029e08a8-b4d9-470b-9a9e-364f1a52fd2f" containerName="collect-profiles" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.099277 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="029e08a8-b4d9-470b-9a9e-364f1a52fd2f" containerName="collect-profiles" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.099410 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="029e08a8-b4d9-470b-9a9e-364f1a52fd2f" containerName="collect-profiles" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.100731 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.105246 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.110877 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.174285 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8b6q\" (UniqueName: \"kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.174529 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.174581 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.221696 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:12 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:12 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:12 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.222064 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.276183 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8b6q\" (UniqueName: \"kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.276308 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.276351 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.276901 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.276950 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.294026 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8b6q\" (UniqueName: \"kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q\") pod \"redhat-marketplace-bnm99\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.414907 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.511904 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.513269 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.595658 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6l4t\" (UniqueName: \"kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.595758 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.595815 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.607413 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.696935 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.697014 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.697057 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6l4t\" (UniqueName: \"kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.698234 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.698775 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.725142 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m6l4t\" (UniqueName: \"kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t\") pod \"redhat-marketplace-879n8\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.748953 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerStarted","Data":"0bc9e0b852fc187defefd7c50e273748fbb864b7de935401b62a95e3150c0bc2"} Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.770961 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7","Type":"ContainerStarted","Data":"27b0ec6eefbe1ea525b2977fca593516d04606b6e03aadabf964b130d12fa7ba"} Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.860682 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.920889 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"] Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.922441 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.926795 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 12:16:12 crc kubenswrapper[4666]: I1203 12:16:12.933709 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.015840 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.015953 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.016240 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp99d\" (UniqueName: \"kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.026538 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.101424 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.102653 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.109756 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.117863 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp99d\" (UniqueName: \"kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.117948 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.118017 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.118569 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.120939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.142577 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp99d\" (UniqueName: \"kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d\") pod \"redhat-operators-v8lpw\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.218553 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:13 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:13 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:13 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.218634 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.225896 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.226127 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpgp\" (UniqueName: \"kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.227108 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.227411 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.233509 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1889fa0a-c57e-4b03-884b-f096236b084b-metrics-certs\") pod \"network-metrics-daemon-s4f78\" (UID: \"1889fa0a-c57e-4b03-884b-f096236b084b\") " pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.236428 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.271780 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.328513 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.328585 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.328631 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpgp\" (UniqueName: \"kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.329807 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.330233 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: W1203 12:16:13.340069 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b99ee2_2db3_40df_81b9_a789a0f2a9ed.slice/crio-d03e7397312bf7634d325712d932035d0d96b069588fe63d73bfeb0d8b8fdc0c WatchSource:0}: Error finding container d03e7397312bf7634d325712d932035d0d96b069588fe63d73bfeb0d8b8fdc0c: Status 404 returned error can't find the container with id d03e7397312bf7634d325712d932035d0d96b069588fe63d73bfeb0d8b8fdc0c Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.358722 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpgp\" (UniqueName: \"kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp\") pod \"redhat-operators-zhntz\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.486398 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s4f78" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.641483 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.731074 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"] Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.761946 4666 generic.go:334] "Generic (PLEG): container finished" podID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerID="c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040" exitCode=0 Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.762023 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerDied","Data":"c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040"} Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.762838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerStarted","Data":"eb89825d608f732e0321032267f35f3e642a80c0c08e1a378879dc6d3610bcb8"} Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.763505 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerStarted","Data":"d03e7397312bf7634d325712d932035d0d96b069588fe63d73bfeb0d8b8fdc0c"} Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.764738 4666 generic.go:334] "Generic (PLEG): container finished" podID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerID="30518dee452786ff5ed41ca036a3dfbf8ef227439a34736a0eb5525dd5c5d5db" exitCode=0 Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.764775 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerDied","Data":"30518dee452786ff5ed41ca036a3dfbf8ef227439a34736a0eb5525dd5c5d5db"} Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.769569 4666 generic.go:334] "Generic (PLEG): container finished" podID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerID="0bc9e0b852fc187defefd7c50e273748fbb864b7de935401b62a95e3150c0bc2" exitCode=0 Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.769646 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerDied","Data":"0bc9e0b852fc187defefd7c50e273748fbb864b7de935401b62a95e3150c0bc2"} Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.772999 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.786209 4666 generic.go:334] "Generic (PLEG): container finished" podID="119ceb76-1272-4185-a39b-70203557b901" containerID="e6dd4f0cb038dc6184fbabe24938f3f05307fadecb38e53342863f6be0c9dd44" exitCode=0 Dec 03 12:16:13 crc kubenswrapper[4666]: I1203 12:16:13.786307 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerDied","Data":"e6dd4f0cb038dc6184fbabe24938f3f05307fadecb38e53342863f6be0c9dd44"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.035241 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=5.035218633 podStartE2EDuration="5.035218633s" podCreationTimestamp="2025-12-03 12:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:13.900126102 +0000 UTC m=+162.745087173" watchObservedRunningTime="2025-12-03 12:16:14.035218633 +0000 UTC m=+162.880179684" Dec 03 12:16:14 crc kubenswrapper[4666]: W1203 12:16:14.043737 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1889fa0a_c57e_4b03_884b_f096236b084b.slice/crio-5bd7bf5530b885afb879d7b09bc04e1372a3482f5a4d3b3a175edeb9cbd1c208 WatchSource:0}: Error finding container 5bd7bf5530b885afb879d7b09bc04e1372a3482f5a4d3b3a175edeb9cbd1c208: Status 404 returned error can't find the container with id 5bd7bf5530b885afb879d7b09bc04e1372a3482f5a4d3b3a175edeb9cbd1c208 Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.053954 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s4f78"] Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.103352 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.218111 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:14 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:14 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:14 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.218201 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.235734 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.247682 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rkcvk" Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.795305 4666 generic.go:334] "Generic (PLEG): container finished" podID="41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" containerID="27b0ec6eefbe1ea525b2977fca593516d04606b6e03aadabf964b130d12fa7ba" exitCode=0 Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.795736 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7","Type":"ContainerDied","Data":"27b0ec6eefbe1ea525b2977fca593516d04606b6e03aadabf964b130d12fa7ba"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.803694 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s4f78" event={"ID":"1889fa0a-c57e-4b03-884b-f096236b084b","Type":"ContainerStarted","Data":"5bd7bf5530b885afb879d7b09bc04e1372a3482f5a4d3b3a175edeb9cbd1c208"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.807769 4666 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerStarted","Data":"90c704b31ebbf588ec9d750c43d338c32d887bac674d21c203291030671672b0"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.809194 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerStarted","Data":"77d9a2f091371e41798dbe075d324a8edbf486e208d4e75ee98d1045b8b7d12d"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.811952 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerStarted","Data":"da8818fed816227dcd52937e3b12fc239c114fec457cb302be854cc77ce6eb8f"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.814864 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerStarted","Data":"a65ebf7b759cee83f7d93e781fb37794cf30a6fc537292c5b786b88599005488"} Dec 03 12:16:14 crc kubenswrapper[4666]: I1203 12:16:14.814895 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerStarted","Data":"85ed9793b1d0dbdc50edb6e7a0ed798c3fb9eead848c31c1b05d34caf45a0bd6"} Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.218263 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:15 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:15 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:15 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.218346 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.281374 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k4gbz" Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.823927 4666 generic.go:334] "Generic (PLEG): container finished" podID="36c1a423-51ec-4cce-bb65-f397809c6848" containerID="90c704b31ebbf588ec9d750c43d338c32d887bac674d21c203291030671672b0" exitCode=0 Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.824012 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerDied","Data":"90c704b31ebbf588ec9d750c43d338c32d887bac674d21c203291030671672b0"} Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.831830 4666 generic.go:334] "Generic (PLEG): container finished" podID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerID="8f708bb2bfc6a24a7c027ca32d3e42386078079d9b3f233088fe9a1a259227d1" exitCode=0 Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.831936 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" 
event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerDied","Data":"8f708bb2bfc6a24a7c027ca32d3e42386078079d9b3f233088fe9a1a259227d1"} Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.841819 4666 generic.go:334] "Generic (PLEG): container finished" podID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerID="da8818fed816227dcd52937e3b12fc239c114fec457cb302be854cc77ce6eb8f" exitCode=0 Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.841920 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerDied","Data":"da8818fed816227dcd52937e3b12fc239c114fec457cb302be854cc77ce6eb8f"} Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.855651 4666 generic.go:334] "Generic (PLEG): container finished" podID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerID="a65ebf7b759cee83f7d93e781fb37794cf30a6fc537292c5b786b88599005488" exitCode=0 Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.855756 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerDied","Data":"a65ebf7b759cee83f7d93e781fb37794cf30a6fc537292c5b786b88599005488"} Dec 03 12:16:15 crc kubenswrapper[4666]: I1203 12:16:15.877342 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s4f78" event={"ID":"1889fa0a-c57e-4b03-884b-f096236b084b","Type":"ContainerStarted","Data":"2cb502acb46035c033f9ce95a8b4841483eb931280cb43d686b8d80349b8ba7a"} Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.005808 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.006976 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.011072 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.011536 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.016792 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.120522 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.120628 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.222517 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.222646 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.222758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.226545 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:16 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:16 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:16 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.226646 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.265604 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.333252 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.356983 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.529741 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir\") pod \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.529824 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access\") pod \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\" (UID: \"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7\") " Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.532508 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" (UID: "41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.547367 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" (UID: "41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.631224 4666 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.631265 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:16 crc kubenswrapper[4666]: I1203 12:16:16.844124 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 12:16:16 crc kubenswrapper[4666]: W1203 12:16:16.867541 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda6536108_2ca8_4569_a2b7_879ce269b341.slice/crio-3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea WatchSource:0}: Error finding container 3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea: Status 404 returned error can't find the container with id 3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.130681 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.131040 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7","Type":"ContainerDied","Data":"41e2fd0a1f31d5c393ef341582f2229f528bae94355421f5424444d6f42786e5"} Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.131137 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e2fd0a1f31d5c393ef341582f2229f528bae94355421f5424444d6f42786e5" Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.148074 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s4f78" event={"ID":"1889fa0a-c57e-4b03-884b-f096236b084b","Type":"ContainerStarted","Data":"ff1bfcfad1eba5c58c72f528e0047557a7b9a7d7a2e8b2943dccd0a2b6786518"} Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.152175 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a6536108-2ca8-4569-a2b7-879ce269b341","Type":"ContainerStarted","Data":"3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea"} Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.221000 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:17 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:17 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:17 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:17 crc kubenswrapper[4666]: I1203 12:16:17.221099 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:18 crc kubenswrapper[4666]: I1203 
12:16:18.218659 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:18 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:18 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:18 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:18 crc kubenswrapper[4666]: I1203 12:16:18.219024 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.009567 4666 patch_prober.go:28] interesting pod/console-f9d7485db-rxcq5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.009625 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxcq5" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.104851 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.104930 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.107486 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.107566 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.207813 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a6536108-2ca8-4569-a2b7-879ce269b341","Type":"ContainerStarted","Data":"b2b7d5d4357cb24d631510e21497e321df2967121e0046646e690bffc2850410"} Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.221375 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:19 crc kubenswrapper[4666]: 
[-]has-synced failed: reason withheld Dec 03 12:16:19 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:19 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.221454 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.249255 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.249223279 podStartE2EDuration="4.249223279s" podCreationTimestamp="2025-12-03 12:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:19.238968285 +0000 UTC m=+168.083929336" watchObservedRunningTime="2025-12-03 12:16:19.249223279 +0000 UTC m=+168.094184330" Dec 03 12:16:19 crc kubenswrapper[4666]: I1203 12:16:19.250116 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-s4f78" podStartSLOduration=149.250109603 podStartE2EDuration="2m29.250109603s" podCreationTimestamp="2025-12-03 12:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:16:17.206650755 +0000 UTC m=+166.051611846" watchObservedRunningTime="2025-12-03 12:16:19.250109603 +0000 UTC m=+168.095070654" Dec 03 12:16:20 crc kubenswrapper[4666]: I1203 12:16:20.220785 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:20 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:20 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:20 crc kubenswrapper[4666]: healthz check failed Dec 03 12:16:20 crc kubenswrapper[4666]: I1203 12:16:20.221416 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:20 crc kubenswrapper[4666]: I1203 12:16:20.245888 4666 generic.go:334] "Generic (PLEG): container finished" podID="a6536108-2ca8-4569-a2b7-879ce269b341" containerID="b2b7d5d4357cb24d631510e21497e321df2967121e0046646e690bffc2850410" exitCode=0 Dec 03 12:16:20 crc kubenswrapper[4666]: I1203 12:16:20.245949 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a6536108-2ca8-4569-a2b7-879ce269b341","Type":"ContainerDied","Data":"b2b7d5d4357cb24d631510e21497e321df2967121e0046646e690bffc2850410"} Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.221933 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 12:16:21 crc kubenswrapper[4666]: [-]has-synced failed: reason withheld Dec 03 12:16:21 crc kubenswrapper[4666]: [+]process-running ok Dec 03 12:16:21 crc kubenswrapper[4666]: healthz check failed 
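The router entries above and below are kubelet's startup-probe loop: patch_prober logs the raw HTTP response from the router's health endpoint, whose body aggregates named healthz checks ([-] failing, [+] passing, "healthz check failed" overall), and a paired prober.go entry records the verdict once per second until the probe finally succeeds at 12:16:22. The "Observed pod startup duration" entries interleaved here are the startup-latency tracker: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (12:16:19.249 - 12:16:15 = 4.249s for revision-pruner-8-crc). Below is a minimal Go sketch of a probe of the shape these entries imply, using the k8s.io/api types; the path and port are taken from the router readiness probe that appears later in this log, while the period and threshold are assumptions, not values read from these entries.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        // Illustrative startup probe; path/port match the router readiness
        // endpoint seen later in this log, period/threshold are assumed.
        startup := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "localhost",
                    Path: "/healthz/ready",
                    Port: intstr.FromInt(1936),
                },
            },
            PeriodSeconds:    1,   // matches the once-per-second cadence above
            FailureThreshold: 120, // assumed: failures accumulate without a restart
        }
        fmt.Printf("%+v\n", startup)
    }

While a startup probe is failing, liveness and readiness probes are held off; only exhausting FailureThreshold would restart the container, which is why these failures repeat harmlessly until the router reports started at 12:16:22.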
Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.222062 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.830806 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.903840 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir\") pod \"a6536108-2ca8-4569-a2b7-879ce269b341\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.904000 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access\") pod \"a6536108-2ca8-4569-a2b7-879ce269b341\" (UID: \"a6536108-2ca8-4569-a2b7-879ce269b341\") " Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.904019 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a6536108-2ca8-4569-a2b7-879ce269b341" (UID: "a6536108-2ca8-4569-a2b7-879ce269b341"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.904593 4666 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6536108-2ca8-4569-a2b7-879ce269b341-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:21 crc kubenswrapper[4666]: I1203 12:16:21.916068 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a6536108-2ca8-4569-a2b7-879ce269b341" (UID: "a6536108-2ca8-4569-a2b7-879ce269b341"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.006746 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6536108-2ca8-4569-a2b7-879ce269b341-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.231131 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.234254 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fgm9v" Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.276329 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.280340 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a6536108-2ca8-4569-a2b7-879ce269b341","Type":"ContainerDied","Data":"3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea"} Dec 03 12:16:22 crc kubenswrapper[4666]: I1203 12:16:22.280382 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3270eef6eea70dbe9dfc7e898935b0b54ae6ed4f7b731e47041e171328f1bcea" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.103729 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.103731 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.104368 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.104469 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.104538 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.105136 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.105230 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.105394 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"0465b448e2f7ec168c1428e7e3bea554e72f809acad36dca7d82852c282ba85c"} pod="openshift-console/downloads-7954f5f757-lts42" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.105520 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" 
containerName="download-server" containerID="cri-o://0465b448e2f7ec168c1428e7e3bea554e72f809acad36dca7d82852c282ba85c" gracePeriod=2 Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.536283 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.540672 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:16:29 crc kubenswrapper[4666]: I1203 12:16:29.858386 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:16:30 crc kubenswrapper[4666]: I1203 12:16:30.476560 4666 generic.go:334] "Generic (PLEG): container finished" podID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerID="0465b448e2f7ec168c1428e7e3bea554e72f809acad36dca7d82852c282ba85c" exitCode=0 Dec 03 12:16:30 crc kubenswrapper[4666]: I1203 12:16:30.476627 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lts42" event={"ID":"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca","Type":"ContainerDied","Data":"0465b448e2f7ec168c1428e7e3bea554e72f809acad36dca7d82852c282ba85c"} Dec 03 12:16:39 crc kubenswrapper[4666]: I1203 12:16:39.104173 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:39 crc kubenswrapper[4666]: I1203 12:16:39.104962 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:39 crc kubenswrapper[4666]: I1203 12:16:39.866544 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:16:39 crc kubenswrapper[4666]: I1203 12:16:39.866633 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:16:40 crc kubenswrapper[4666]: I1203 12:16:40.659575 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7q6pv" Dec 03 12:16:41 crc kubenswrapper[4666]: I1203 12:16:41.256473 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:16:41 crc kubenswrapper[4666]: I1203 12:16:41.256883 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" 
probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:16:43 crc kubenswrapper[4666]: I1203 12:16:43.275118 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 12:16:49 crc kubenswrapper[4666]: I1203 12:16:49.104026 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:49 crc kubenswrapper[4666]: I1203 12:16:49.106159 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.005018 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:16:51 crc kubenswrapper[4666]: E1203 12:16:51.005331 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6536108-2ca8-4569-a2b7-879ce269b341" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.005351 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6536108-2ca8-4569-a2b7-879ce269b341" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: E1203 12:16:51.005366 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.005375 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.005491 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d6cc4c-b7bc-4fcc-9aee-a74c613b7fa7" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.005513 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6536108-2ca8-4569-a2b7-879ce269b341" containerName="pruner" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.006003 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.009950 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.010387 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.016806 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.130951 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.131154 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.233384 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.233670 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.233904 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.269042 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:51 crc kubenswrapper[4666]: I1203 12:16:51.338574 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.200498 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.201672 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.221248 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.308669 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.308739 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.308837 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.410338 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.410445 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.410507 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.410573 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.410529 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock\") pod \"installer-9-crc\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.435383 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:56 crc kubenswrapper[4666]: I1203 12:16:56.526709 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 12:16:59 crc kubenswrapper[4666]: I1203 12:16:59.102682 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:16:59 crc kubenswrapper[4666]: I1203 12:16:59.102807 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.102645 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.103632 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.865756 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.865837 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.865902 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.866679 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:17:09 crc kubenswrapper[4666]: I1203 12:17:09.866748 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08" gracePeriod=600 Dec 03 12:17:11 crc kubenswrapper[4666]: I1203 12:17:11.845761 4666 generic.go:334] "Generic 
Dec 03 12:17:11 crc kubenswrapper[4666]: I1203 12:17:11.845761 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08" exitCode=0
Dec 03 12:17:11 crc kubenswrapper[4666]: I1203 12:17:11.845836 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08"}
Dec 03 12:17:13 crc kubenswrapper[4666]: E1203 12:17:13.342075 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 03 12:17:13 crc kubenswrapper[4666]: E1203 12:17:13.342374 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp99d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v8lpw_openshift-marketplace(614c2c74-3d6f-4930-8c3e-a1bd11714e03): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 12:17:13 crc kubenswrapper[4666]: E1203 12:17:13.344220 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v8lpw" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03"
Dec 03 12:17:16 crc kubenswrapper[4666]: E1203 12:17:16.511774 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v8lpw" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03"
Dec 03 12:17:16 crc kubenswrapper[4666]: E1203
12:17:16.658137 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 12:17:16 crc kubenswrapper[4666]: E1203 12:17:16.658317 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ldn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rpltp_openshift-marketplace(130907ce-450a-4a73-92c6-aae2f3b2f850): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:16 crc kubenswrapper[4666]: E1203 12:17:16.660321 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rpltp" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" Dec 03 12:17:17 crc kubenswrapper[4666]: E1203 12:17:17.916021 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rpltp" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" Dec 03 12:17:17 crc kubenswrapper[4666]: E1203 12:17:17.999051 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 12:17:17 crc kubenswrapper[4666]: E1203 12:17:17.999479 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8b6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bnm99_openshift-marketplace(36c1a423-51ec-4cce-bb65-f397809c6848): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.000889 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bnm99" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.014046 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.014284 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9srb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5fq8n_openshift-marketplace(ed8c4c35-0630-4f7c-aa9a-008a349c70db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.015709 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5fq8n" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.036755 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.037005 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qpgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zhntz_openshift-marketplace(f0c6678b-ef6b-4681-a187-cf69e14eff7e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:18 crc kubenswrapper[4666]: E1203 12:17:18.038293 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zhntz" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" Dec 03 12:17:19 crc kubenswrapper[4666]: I1203 12:17:19.102795 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:17:19 crc kubenswrapper[4666]: I1203 12:17:19.102886 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.571598 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bnm99" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.572300 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zhntz" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.671257 4666 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.671851 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qccw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4knmd_openshift-marketplace(de3ac985-b5ac-4afa-9d61-1837bbe36c50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.673074 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4knmd" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.681416 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.681637 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6l4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-879n8_openshift-marketplace(b2b99ee2-2db3-40df-81b9-a789a0f2a9ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.682900 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-879n8" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.735295 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.735862 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tt5cx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fk2lr_openshift-marketplace(119ceb76-1272-4185-a39b-70203557b901): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.737529 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fk2lr" podUID="119ceb76-1272-4185-a39b-70203557b901" Dec 03 12:17:19 crc kubenswrapper[4666]: E1203 12:17:19.905921 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-879n8" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.040378 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.117899 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.913216 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.915672 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lts42" event={"ID":"6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca","Type":"ContainerStarted","Data":"db4bd302c7812f259506aa01b79e235bcb6fba895312ca0f4b928fdf78bc0a6b"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.916062 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 
12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.916648 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.916699 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.917048 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"578a3906-05b4-456c-853d-b9949ce41520","Type":"ContainerStarted","Data":"a16fd3994f33353e054bfbb21587002f6b522f2ca5683b5aab3292c327df1c36"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.917104 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"578a3906-05b4-456c-853d-b9949ce41520","Type":"ContainerStarted","Data":"a24a40febb6cb1ef830453cb650284eccee5b124ac301b3e41a1ede915fc6caf"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.918250 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"efa8b052-a562-46f9-ab41-51aa3b3a0c39","Type":"ContainerStarted","Data":"b97d42f758719fab20094f775153909c7166b8d1ed529a66cab39f926cf44f89"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.918286 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"efa8b052-a562-46f9-ab41-51aa3b3a0c39","Type":"ContainerStarted","Data":"3fdeef9dacfb1cb0b79f5e5700b9c5ace87ee364644767259f9cbb7e30d010a9"} Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.951423 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=30.951400289 podStartE2EDuration="30.951400289s" podCreationTimestamp="2025-12-03 12:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:20.950189715 +0000 UTC m=+229.795150786" watchObservedRunningTime="2025-12-03 12:17:20.951400289 +0000 UTC m=+229.796361350" Dec 03 12:17:20 crc kubenswrapper[4666]: I1203 12:17:20.969373 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=24.969351826 podStartE2EDuration="24.969351826s" podCreationTimestamp="2025-12-03 12:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:17:20.965274613 +0000 UTC m=+229.810235664" watchObservedRunningTime="2025-12-03 12:17:20.969351826 +0000 UTC m=+229.814312867" Dec 03 12:17:21 crc kubenswrapper[4666]: I1203 12:17:21.927043 4666 generic.go:334] "Generic (PLEG): container finished" podID="578a3906-05b4-456c-853d-b9949ce41520" containerID="a16fd3994f33353e054bfbb21587002f6b522f2ca5683b5aab3292c327df1c36" exitCode=0 Dec 03 12:17:21 crc kubenswrapper[4666]: I1203 12:17:21.927132 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"578a3906-05b4-456c-853d-b9949ce41520","Type":"ContainerDied","Data":"a16fd3994f33353e054bfbb21587002f6b522f2ca5683b5aab3292c327df1c36"} Dec 03 12:17:21 crc kubenswrapper[4666]: I1203 12:17:21.929791 4666 patch_prober.go:28] interesting pod/downloads-7954f5f757-lts42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 03 12:17:21 crc kubenswrapper[4666]: I1203 12:17:21.929838 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lts42" podUID="6dbd4cf6-5fe7-4e07-a6ae-572a0b9126ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.221966 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.360693 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir\") pod \"578a3906-05b4-456c-853d-b9949ce41520\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.360862 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access\") pod \"578a3906-05b4-456c-853d-b9949ce41520\" (UID: \"578a3906-05b4-456c-853d-b9949ce41520\") " Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.360873 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "578a3906-05b4-456c-853d-b9949ce41520" (UID: "578a3906-05b4-456c-853d-b9949ce41520"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.361280 4666 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/578a3906-05b4-456c-853d-b9949ce41520-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.385286 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "578a3906-05b4-456c-853d-b9949ce41520" (UID: "578a3906-05b4-456c-853d-b9949ce41520"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.463055 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a3906-05b4-456c-853d-b9949ce41520-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.943081 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"578a3906-05b4-456c-853d-b9949ce41520","Type":"ContainerDied","Data":"a24a40febb6cb1ef830453cb650284eccee5b124ac301b3e41a1ede915fc6caf"} Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.943671 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24a40febb6cb1ef830453cb650284eccee5b124ac301b3e41a1ede915fc6caf" Dec 03 12:17:23 crc kubenswrapper[4666]: I1203 12:17:23.943163 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 12:17:29 crc kubenswrapper[4666]: I1203 12:17:29.124562 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lts42" Dec 03 12:17:31 crc kubenswrapper[4666]: I1203 12:17:31.256290 4666 patch_prober.go:28] interesting pod/router-default-5444994796-fgm9v container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:17:31 crc kubenswrapper[4666]: I1203 12:17:31.256372 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-fgm9v" podUID="96e5f9ae-3c4b-4016-b1d5-c6a1a1326581" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.094259 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerStarted","Data":"60fd7da638e8fdd9c19575a6c2bd8f8b64494fc1f870ddae123981281a61491a"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.098932 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerStarted","Data":"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.101316 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerStarted","Data":"1911aef9191e22eff76b2ed92cdeb31853bec6526d8dfdc763f4f2b012173dd1"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.103540 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerStarted","Data":"4b8b1e97495270cf47fdd50174ab08dfb392811ea41649b90fb59b985dbec8b5"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.105672 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" 
event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerStarted","Data":"76f6fae45936d11b2759b1f4bce57ffa0088334a6890072ff5fefd34b674395c"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.107971 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerStarted","Data":"a5e25060d27ca2122d15a9f0a8c5c3e072d425099da71422b26121797869da53"} Dec 03 12:17:46 crc kubenswrapper[4666]: I1203 12:17:46.110418 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerStarted","Data":"fe1f8b5e1aeea194af67442b29796e529ffa2866b234430c67084ecdc19b9821"} Dec 03 12:17:47 crc kubenswrapper[4666]: I1203 12:17:47.153605 4666 generic.go:334] "Generic (PLEG): container finished" podID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerID="76f6fae45936d11b2759b1f4bce57ffa0088334a6890072ff5fefd34b674395c" exitCode=0 Dec 03 12:17:47 crc kubenswrapper[4666]: I1203 12:17:47.154016 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerDied","Data":"76f6fae45936d11b2759b1f4bce57ffa0088334a6890072ff5fefd34b674395c"} Dec 03 12:17:47 crc kubenswrapper[4666]: I1203 12:17:47.157707 4666 generic.go:334] "Generic (PLEG): container finished" podID="36c1a423-51ec-4cce-bb65-f397809c6848" containerID="1911aef9191e22eff76b2ed92cdeb31853bec6526d8dfdc763f4f2b012173dd1" exitCode=0 Dec 03 12:17:47 crc kubenswrapper[4666]: I1203 12:17:47.157772 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerDied","Data":"1911aef9191e22eff76b2ed92cdeb31853bec6526d8dfdc763f4f2b012173dd1"} Dec 03 12:17:47 crc kubenswrapper[4666]: I1203 12:17:47.181788 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerStarted","Data":"74f01736504d071e62ef7c36c133703555e6fda54ece6d9ca764bc4a5c8736b1"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.188955 4666 generic.go:334] "Generic (PLEG): container finished" podID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerID="238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.189070 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerDied","Data":"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.191742 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerStarted","Data":"948b2e0773007541791722ae090532dbe4db79eb4a21f7e164b7f3bd146eba23"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.198373 4666 generic.go:334] "Generic (PLEG): container finished" podID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerID="60fd7da638e8fdd9c19575a6c2bd8f8b64494fc1f870ddae123981281a61491a" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.198472 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerDied","Data":"60fd7da638e8fdd9c19575a6c2bd8f8b64494fc1f870ddae123981281a61491a"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.203045 4666 generic.go:334] "Generic (PLEG): container finished" podID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerID="74f01736504d071e62ef7c36c133703555e6fda54ece6d9ca764bc4a5c8736b1" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.203128 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerDied","Data":"74f01736504d071e62ef7c36c133703555e6fda54ece6d9ca764bc4a5c8736b1"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.205776 4666 generic.go:334] "Generic (PLEG): container finished" podID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerID="4b8b1e97495270cf47fdd50174ab08dfb392811ea41649b90fb59b985dbec8b5" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.205846 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerDied","Data":"4b8b1e97495270cf47fdd50174ab08dfb392811ea41649b90fb59b985dbec8b5"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.208910 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerStarted","Data":"ab62ebe12477f6221e1bf443749213a8bc85e7829527e505948c1c1983273f77"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.215751 4666 generic.go:334] "Generic (PLEG): container finished" podID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerID="a5e25060d27ca2122d15a9f0a8c5c3e072d425099da71422b26121797869da53" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.215844 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerDied","Data":"a5e25060d27ca2122d15a9f0a8c5c3e072d425099da71422b26121797869da53"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.219544 4666 generic.go:334] "Generic (PLEG): container finished" podID="119ceb76-1272-4185-a39b-70203557b901" containerID="fe1f8b5e1aeea194af67442b29796e529ffa2866b234430c67084ecdc19b9821" exitCode=0 Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.219583 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerDied","Data":"fe1f8b5e1aeea194af67442b29796e529ffa2866b234430c67084ecdc19b9821"} Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.231595 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5fq8n" podStartSLOduration=6.382957805 podStartE2EDuration="1m39.231556843s" podCreationTimestamp="2025-12-03 12:16:09 +0000 UTC" firstStartedPulling="2025-12-03 12:16:14.817029783 +0000 UTC m=+163.661990834" lastFinishedPulling="2025-12-03 12:17:47.665628781 +0000 UTC m=+256.510589872" observedRunningTime="2025-12-03 12:17:48.228528539 +0000 UTC m=+257.073489590" watchObservedRunningTime="2025-12-03 12:17:48.231556843 +0000 UTC m=+257.076517894" Dec 03 12:17:48 crc kubenswrapper[4666]: I1203 12:17:48.245119 4666 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnm99" podStartSLOduration=4.378822311 podStartE2EDuration="1m36.245077287s" podCreationTimestamp="2025-12-03 12:16:12 +0000 UTC" firstStartedPulling="2025-12-03 12:16:15.826161947 +0000 UTC m=+164.671122998" lastFinishedPulling="2025-12-03 12:17:47.692416923 +0000 UTC m=+256.537377974" observedRunningTime="2025-12-03 12:17:48.243643407 +0000 UTC m=+257.088604458" watchObservedRunningTime="2025-12-03 12:17:48.245077287 +0000 UTC m=+257.090038338" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.229148 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerStarted","Data":"227ca731416a9a911095b3e13d60ef2dadf3c80107dfe1b163695ff006439bd8"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.231056 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerStarted","Data":"c2c00e29009e1547db66580d47b73ebf5a3d61855e583e3381fd1476351413c7"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.234128 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerStarted","Data":"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.236532 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerStarted","Data":"ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.239163 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerStarted","Data":"2179d2f6dfa7ee428ff292faa748cd8036c15949ad5e015e4b5f2d0a9ef2b4f0"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.241736 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerStarted","Data":"627a6258aab26ecd1ab5d7a76e6f947f8ec1bd67bbdc6bb181038151d583354f"} Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.257379 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rpltp" podStartSLOduration=4.178011627 podStartE2EDuration="1m39.257346699s" podCreationTimestamp="2025-12-03 12:16:10 +0000 UTC" firstStartedPulling="2025-12-03 12:16:13.772662892 +0000 UTC m=+162.617623963" lastFinishedPulling="2025-12-03 12:17:48.851997984 +0000 UTC m=+257.696959035" observedRunningTime="2025-12-03 12:17:49.253239385 +0000 UTC m=+258.098200426" watchObservedRunningTime="2025-12-03 12:17:49.257346699 +0000 UTC m=+258.102307750" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.295118 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4knmd" podStartSLOduration=5.533706598 podStartE2EDuration="1m39.295085594s" podCreationTimestamp="2025-12-03 12:16:10 +0000 UTC" firstStartedPulling="2025-12-03 12:16:14.816622111 +0000 UTC m=+163.661583162" 
lastFinishedPulling="2025-12-03 12:17:48.578001107 +0000 UTC m=+257.422962158" observedRunningTime="2025-12-03 12:17:49.277429305 +0000 UTC m=+258.122390366" watchObservedRunningTime="2025-12-03 12:17:49.295085594 +0000 UTC m=+258.140046635" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.296890 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zhntz" podStartSLOduration=3.43652075 podStartE2EDuration="1m36.296883884s" podCreationTimestamp="2025-12-03 12:16:13 +0000 UTC" firstStartedPulling="2025-12-03 12:16:15.839277491 +0000 UTC m=+164.684238542" lastFinishedPulling="2025-12-03 12:17:48.699640625 +0000 UTC m=+257.544601676" observedRunningTime="2025-12-03 12:17:49.292435481 +0000 UTC m=+258.137396542" watchObservedRunningTime="2025-12-03 12:17:49.296883884 +0000 UTC m=+258.141844935" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.316557 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fk2lr" podStartSLOduration=6.341005943 podStartE2EDuration="1m40.316533458s" podCreationTimestamp="2025-12-03 12:16:09 +0000 UTC" firstStartedPulling="2025-12-03 12:16:14.817329131 +0000 UTC m=+163.662290182" lastFinishedPulling="2025-12-03 12:17:48.792856646 +0000 UTC m=+257.637817697" observedRunningTime="2025-12-03 12:17:49.310682056 +0000 UTC m=+258.155643117" watchObservedRunningTime="2025-12-03 12:17:49.316533458 +0000 UTC m=+258.161494509" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.339673 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-879n8" podStartSLOduration=4.542363061 podStartE2EDuration="1m37.338313851s" podCreationTimestamp="2025-12-03 12:16:12 +0000 UTC" firstStartedPulling="2025-12-03 12:16:15.84826748 +0000 UTC m=+164.693228531" lastFinishedPulling="2025-12-03 12:17:48.64421827 +0000 UTC m=+257.489179321" observedRunningTime="2025-12-03 12:17:49.337283273 +0000 UTC m=+258.182244314" watchObservedRunningTime="2025-12-03 12:17:49.338313851 +0000 UTC m=+258.183274902" Dec 03 12:17:49 crc kubenswrapper[4666]: I1203 12:17:49.363399 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8lpw" podStartSLOduration=4.5679806 podStartE2EDuration="1m37.363374655s" podCreationTimestamp="2025-12-03 12:16:12 +0000 UTC" firstStartedPulling="2025-12-03 12:16:15.858457652 +0000 UTC m=+164.703418703" lastFinishedPulling="2025-12-03 12:17:48.653851717 +0000 UTC m=+257.498812758" observedRunningTime="2025-12-03 12:17:49.359460987 +0000 UTC m=+258.204422048" watchObservedRunningTime="2025-12-03 12:17:49.363374655 +0000 UTC m=+258.208335706" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.078038 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.078128 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.298309 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.299349 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:17:50 crc 
kubenswrapper[4666]: I1203 12:17:50.479230 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.479294 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.803560 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.803885 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:17:50 crc kubenswrapper[4666]: I1203 12:17:50.844807 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:17:51 crc kubenswrapper[4666]: I1203 12:17:51.165312 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5fq8n" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:51 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:51 crc kubenswrapper[4666]: > Dec 03 12:17:51 crc kubenswrapper[4666]: I1203 12:17:51.358808 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fk2lr" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:51 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:51 crc kubenswrapper[4666]: > Dec 03 12:17:51 crc kubenswrapper[4666]: I1203 12:17:51.520010 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rpltp" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:51 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:51 crc kubenswrapper[4666]: > Dec 03 12:17:52 crc kubenswrapper[4666]: I1203 12:17:52.416039 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:17:52 crc kubenswrapper[4666]: I1203 12:17:52.417368 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:17:52 crc kubenswrapper[4666]: I1203 12:17:52.861928 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:52 crc kubenswrapper[4666]: I1203 12:17:52.862466 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.272998 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.273341 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.460289 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bnm99" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" 
containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:53 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:53 crc kubenswrapper[4666]: > Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.481388 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.642634 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:17:53 crc kubenswrapper[4666]: I1203 12:17:53.642709 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:17:54 crc kubenswrapper[4666]: I1203 12:17:54.314065 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:54 crc kubenswrapper[4666]: I1203 12:17:54.341770 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8lpw" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:54 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:54 crc kubenswrapper[4666]: > Dec 03 12:17:54 crc kubenswrapper[4666]: I1203 12:17:54.683395 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zhntz" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:17:54 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:17:54 crc kubenswrapper[4666]: > Dec 03 12:17:55 crc kubenswrapper[4666]: I1203 12:17:55.340550 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:17:56 crc kubenswrapper[4666]: I1203 12:17:56.287777 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-879n8" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="registry-server" containerID="cri-o://2179d2f6dfa7ee428ff292faa748cd8036c15949ad5e015e4b5f2d0a9ef2b4f0" gracePeriod=2 Dec 03 12:17:57 crc kubenswrapper[4666]: I1203 12:17:57.298481 4666 generic.go:334] "Generic (PLEG): container finished" podID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerID="2179d2f6dfa7ee428ff292faa748cd8036c15949ad5e015e4b5f2d0a9ef2b4f0" exitCode=0 Dec 03 12:17:57 crc kubenswrapper[4666]: I1203 12:17:57.298610 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerDied","Data":"2179d2f6dfa7ee428ff292faa748cd8036c15949ad5e015e4b5f2d0a9ef2b4f0"} Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.043206 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.157780 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content\") pod \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.158287 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities\") pod \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.158391 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6l4t\" (UniqueName: \"kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t\") pod \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\" (UID: \"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed\") " Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.159243 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities" (OuterVolumeSpecName: "utilities") pod "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" (UID: "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.164932 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t" (OuterVolumeSpecName: "kube-api-access-m6l4t") pod "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" (UID: "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed"). InnerVolumeSpecName "kube-api-access-m6l4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.192476 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" (UID: "b2b99ee2-2db3-40df-81b9-a789a0f2a9ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.260468 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6l4t\" (UniqueName: \"kubernetes.io/projected/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-kube-api-access-m6l4t\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.260537 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.260555 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.314145 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-879n8" event={"ID":"b2b99ee2-2db3-40df-81b9-a789a0f2a9ed","Type":"ContainerDied","Data":"d03e7397312bf7634d325712d932035d0d96b069588fe63d73bfeb0d8b8fdc0c"} Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.314231 4666 scope.go:117] "RemoveContainer" containerID="2179d2f6dfa7ee428ff292faa748cd8036c15949ad5e015e4b5f2d0a9ef2b4f0" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.314262 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-879n8" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.340866 4666 scope.go:117] "RemoveContainer" containerID="74f01736504d071e62ef7c36c133703555e6fda54ece6d9ca764bc4a5c8736b1" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.352010 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.355275 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-879n8"] Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.380600 4666 scope.go:117] "RemoveContainer" containerID="da8818fed816227dcd52937e3b12fc239c114fec457cb302be854cc77ce6eb8f" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390351 4666 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.390672 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="extract-content" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390700 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="extract-content" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.390720 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="extract-utilities" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390728 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="extract-utilities" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.390742 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="registry-server" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390750 4666 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="registry-server" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.390767 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578a3906-05b4-456c-853d-b9949ce41520" containerName="pruner" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390775 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="578a3906-05b4-456c-853d-b9949ce41520" containerName="pruner" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390892 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="578a3906-05b4-456c-853d-b9949ce41520" containerName="pruner" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.390912 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" containerName="registry-server" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391449 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391532 4666 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391799 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7" gracePeriod=15 Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391875 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731" gracePeriod=15 Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391882 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0" gracePeriod=15 Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391931 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725" gracePeriod=15 Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.391970 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7" gracePeriod=15 Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392296 4666 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392777 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 
12:17:58.392801 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392818 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392827 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392842 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392849 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392860 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392866 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392873 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392880 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392893 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392899 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: E1203 12:17:58.392907 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.392913 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393078 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393201 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393209 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393221 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393229 4666 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.393235 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464297 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464379 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464419 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464448 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464473 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464524 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464545 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.464563 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566017 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566104 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566132 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566156 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566185 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566221 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566246 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566275 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566290 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566325 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566368 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566352 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566387 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566387 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566371 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:17:58 crc kubenswrapper[4666]: I1203 12:17:58.566505 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.330889 4666 generic.go:334] "Generic (PLEG): container finished" podID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" containerID="b97d42f758719fab20094f775153909c7166b8d1ed529a66cab39f926cf44f89" exitCode=0
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.330984 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"efa8b052-a562-46f9-ab41-51aa3b3a0c39","Type":"ContainerDied","Data":"b97d42f758719fab20094f775153909c7166b8d1ed529a66cab39f926cf44f89"}
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.335454 4666 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.336603 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.338447 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.342720 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.344517 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731" exitCode=0
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.344794 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7" exitCode=0
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.345021 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725" exitCode=0
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.345211 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0" exitCode=2
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.344649 4666 scope.go:117] "RemoveContainer" containerID="3417f46ad004c37f2d613b7b9bcc3c1cd1fd045f6655abae2030f8ba0d103b71"
Dec 03 12:17:59 crc kubenswrapper[4666]: I1203 12:17:59.431573 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b99ee2-2db3-40df-81b9-a789a0f2a9ed" path="/var/lib/kubelet/pods/b2b99ee2-2db3-40df-81b9-a789a0f2a9ed/volumes"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.129927 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5fq8n"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.130665 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.131058 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.181438 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5fq8n"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.182309 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.182932 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.340949 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fk2lr"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.342416 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.343061 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.343398 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.354366 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.382727 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fk2lr"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.383630 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.384021 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.384457 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.588474 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rpltp"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.589590 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.590710 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.591354 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.591852 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.656326 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rpltp"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.656881 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.657678 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.658188 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.658516 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.683535 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.686256 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.686753 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.687227 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.687703 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.822502 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock\") pod \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") "
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.822622 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock" (OuterVolumeSpecName: "var-lock") pod "efa8b052-a562-46f9-ab41-51aa3b3a0c39" (UID: "efa8b052-a562-46f9-ab41-51aa3b3a0c39"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.823364 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir\") pod \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") "
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.823615 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access\") pod \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\" (UID: \"efa8b052-a562-46f9-ab41-51aa3b3a0c39\") "
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.823463 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "efa8b052-a562-46f9-ab41-51aa3b3a0c39" (UID: "efa8b052-a562-46f9-ab41-51aa3b3a0c39"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.824482 4666 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.825260 4666 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/efa8b052-a562-46f9-ab41-51aa3b3a0c39-var-lock\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.831969 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "efa8b052-a562-46f9-ab41-51aa3b3a0c39" (UID: "efa8b052-a562-46f9-ab41-51aa3b3a0c39"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.848368 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4knmd"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.849785 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.850432 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.851046 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.851694 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.852427 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: I1203 12:18:00.927018 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efa8b052-a562-46f9-ab41-51aa3b3a0c39-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.972193 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:18:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:18:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:18:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T12:18:00Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.972679 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.972867 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.973035 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.973251 4666 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:00 crc kubenswrapper[4666]: E1203 12:18:00.973274 4666 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.364138 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"efa8b052-a562-46f9-ab41-51aa3b3a0c39","Type":"ContainerDied","Data":"3fdeef9dacfb1cb0b79f5e5700b9c5ace87ee364644767259f9cbb7e30d010a9"}
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.364196 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fdeef9dacfb1cb0b79f5e5700b9c5ace87ee364644767259f9cbb7e30d010a9"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.364201 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.378481 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.378921 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.379516 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.379945 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.380369 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.424944 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.425625 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.426365 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.426609 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:01 crc kubenswrapper[4666]: I1203 12:18:01.426795 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.009564 4666 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.010162 4666 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.010607 4666 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.011359 4666 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.011744 4666 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.011781 4666 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.012031 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.212610 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.483948 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnm99"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.485053 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.485565 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.486165 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.487068 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.488166 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.488951 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.538402 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnm99"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.539425 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.539765 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.540282 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.541030 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.541513 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: I1203 12:18:02.541915 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:02 crc kubenswrapper[4666]: E1203 12:18:02.613956 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.318287 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8lpw"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.319003 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.319262 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.319472 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.319834 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.320482 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.320963 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.322696 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.361242 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8lpw"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.362065 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.362796 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.363530 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.363848 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.364408 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.364876 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.365472 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.383363 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.384276 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7" exitCode=0
Dec 03 12:18:03 crc kubenswrapper[4666]: E1203 12:18:03.416500 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s"
Dec 03 12:18:03 crc kubenswrapper[4666]: E1203 12:18:03.447947 4666 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.448512 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:03 crc kubenswrapper[4666]: W1203 12:18:03.475866 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d6ba0ba744d29c12640c27553cd651b05a87568c01ad5b7e56c2efdd77e5aa99 WatchSource:0}: Error finding container d6ba0ba744d29c12640c27553cd651b05a87568c01ad5b7e56c2efdd77e5aa99: Status 404 returned error can't find the container with id d6ba0ba744d29c12640c27553cd651b05a87568c01ad5b7e56c2efdd77e5aa99
Dec 03 12:18:03 crc kubenswrapper[4666]: E1203 12:18:03.480352 4666 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db3c40517d154 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:18:03.47951138 +0000 UTC m=+272.324472431,LastTimestamp:2025-12-03 12:18:03.47951138 +0000 UTC m=+272.324472431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.697262 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zhntz"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.698719 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.699420 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.699888 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.700152 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.700603 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.701181 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.701492 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.701884 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.738911 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zhntz"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.739944 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.741424 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.742182 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.742731 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.743222 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.743784 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.744291 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:03 crc kubenswrapper[4666]: I1203 12:18:03.744590 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:04 crc kubenswrapper[4666]: I1203 12:18:04.398205 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d6ba0ba744d29c12640c27553cd651b05a87568c01ad5b7e56c2efdd77e5aa99"}
Dec 03 12:18:05 crc kubenswrapper[4666]: E1203 12:18:05.017832 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s"
Dec 03 12:18:05 crc kubenswrapper[4666]: E1203 12:18:05.175539 4666 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db3c40517d154 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 12:18:03.47951138 +0000 UTC m=+272.324472431,LastTimestamp:2025-12-03 12:18:03.47951138 +0000 UTC m=+272.324472431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.317836 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.318591 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.319312 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.319769 4666 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.320063 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.320319 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.320571 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.320799 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.321116 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.321433 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.321696 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.406257 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa"}
Dec 03 12:18:05 crc kubenswrapper[4666]: E1203 12:18:05.407559 4666 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.407790 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.408521 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.409108 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.409523 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.409707 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.410140 4666 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.410421 4666 scope.go:117] "RemoveContainer" containerID="6646aca950ff160b2d72160206852c93d024bc3650d744496b6fcbd6edd6c731"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.410425 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.410526 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.411339 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.412162 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.412562 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.426634 4666 scope.go:117] "RemoveContainer" containerID="78bce527d57289e2384e8087d754c881590836311ae9d4f27afb3cfecc21b7a7"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.443008 4666 scope.go:117] "RemoveContainer" containerID="35ae51c603cae8696b63b021f36d6ceba1efbd31f4a35d175cf8c37f6f42b725"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.457679 4666 scope.go:117] "RemoveContainer" containerID="ae60251996aee6ed9b1343410227dce22aef44815ee72a8a86f1a39eaf9cc9f0"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.472424 4666 scope.go:117] "RemoveContainer" containerID="e1221ffa281a8796d8a253a24300e926b0640e025a8f39e19f9baaa037d96ce7"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.488606 4666 scope.go:117] "RemoveContainer" containerID="074aa2e60c73ee40228fa4aefc6d9366f8d4b3f781bc3addd4219814c4d5585f"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489581 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489668 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489766 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489867 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489879 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.489926 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.490342 4666 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.490412 4666 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.490472 4666 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.729924 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.730522 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.730841 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.731133 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.731383 4666 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.731657 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.731929 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.732190 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:05 crc kubenswrapper[4666]: I1203 12:18:05.732410 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:06 crc kubenswrapper[4666]: E1203 12:18:06.419761 4666 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:07 crc kubenswrapper[4666]: I1203 12:18:07.437183 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Dec 03 12:18:08 crc kubenswrapper[4666]: E1203 12:18:08.218958 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="6.4s"
Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.426884 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.428407 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.428753 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.429039 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.429720 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.430127 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:11 crc kubenswrapper[4666]: I1203 12:18:11.430565 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.422722 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.425317 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.425983 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.426465 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.426946 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.427395 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc 
kubenswrapper[4666]: I1203 12:18:12.427810 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.428199 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.428481 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.437380 4666 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.437424 4666 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:12 crc kubenswrapper[4666]: E1203 12:18:12.438127 4666 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:12 crc kubenswrapper[4666]: I1203 12:18:12.438730 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:12 crc kubenswrapper[4666]: W1203 12:18:12.569671 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fe9d3b5e5d4646008d7de7152ae5ffec7b426bc3684b2145d522baec80e3f460 WatchSource:0}: Error finding container fe9d3b5e5d4646008d7de7152ae5ffec7b426bc3684b2145d522baec80e3f460: Status 404 returned error can't find the container with id fe9d3b5e5d4646008d7de7152ae5ffec7b426bc3684b2145d522baec80e3f460 Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.474560 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.474629 4666 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554" exitCode=1 Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.474746 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554"} Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.475490 4666 scope.go:117] "RemoveContainer" containerID="1486c5ba9a0aac0e6d0a703e48ebfd1f41a7fdbb95cdec6e302c2f4a84ac0554" Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.475927 4666 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.476497 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe9d3b5e5d4646008d7de7152ae5ffec7b426bc3684b2145d522baec80e3f460"} Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.476558 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.476950 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.477186 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 
Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.477562 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.478153 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.478625 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.479500 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:13 crc kubenswrapper[4666]: I1203 12:18:13.479884 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.487741 4666 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d6922ee3434628564cebc659448fffd6a8abbbc1e4c1724a9e20e48b9b36da76" exitCode=0
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.487812 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d6922ee3434628564cebc659448fffd6a8abbbc1e4c1724a9e20e48b9b36da76"}
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.488112 4666 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.488395 4666 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d"
Dec 03 12:18:14 crc kubenswrapper[4666]: E1203 12:18:14.488987 4666 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.489750 4666 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.489970 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.490206 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.490597 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.492611 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.493175 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.493629 4666 status_manager.go:851] "Failed to get status for pod" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" pod="openshift-marketplace/certified-operators-4knmd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.494252 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.494700 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.507235 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.507328 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d9bd3457d2895d5936c77df3fcb2350c19f73db1c9e965ef09f63a3206fdeba"}
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.511023 4666 status_manager.go:851] "Failed to get status for pod" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" pod="openshift-marketplace/redhat-operators-zhntz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zhntz\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.511434 4666 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.511639 4666 status_manager.go:851] "Failed to get status for pod" podUID="119ceb76-1272-4185-a39b-70203557b901" pod="openshift-marketplace/certified-operators-fk2lr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-fk2lr\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.511967 4666 status_manager.go:851] "Failed to get status for pod" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" pod="openshift-marketplace/community-operators-rpltp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpltp\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.512479 4666 status_manager.go:851] "Failed to get status for pod" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" pod="openshift-marketplace/redhat-operators-v8lpw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v8lpw\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.512744 4666 status_manager.go:851] "Failed to get status for pod" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" pod="openshift-marketplace/community-operators-5fq8n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5fq8n\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.512962 4666 status_manager.go:851] "Failed to get status for pod" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4knmd\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:14 crc kubenswrapper[4666]: I1203 12:18:14.513336 4666 status_manager.go:851] "Failed to get status for pod" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" pod="openshift-marketplace/redhat-marketplace-bnm99" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bnm99\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 12:18:14 crc kubenswrapper[4666]: E1203 12:18:14.621166 4666 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="7s" Dec 03 12:18:15 crc kubenswrapper[4666]: I1203 12:18:15.521199 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e39f6e86221dd69ebef5dd1b443f059ad78166fb0b127e5b7d0626da1c8fe734"} Dec 03 12:18:15 crc kubenswrapper[4666]: I1203 12:18:15.521580 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6be171414e0d4c312eb3d7d551ac7c6e658dbf622c53c722f34c11881bc07043"} Dec 03 12:18:15 crc kubenswrapper[4666]: I1203 12:18:15.521594 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72dffe585b55c597bce49f52a5987de4a21ebb9b749adbbfa5eb2728613b25a1"} Dec 03 12:18:15 crc kubenswrapper[4666]: I1203 12:18:15.521605 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2fdd38142d3c8008d4ec70905da94193335ea5427e619280d18b1a918d659896"} Dec 03 12:18:16 crc kubenswrapper[4666]: I1203 12:18:16.531853 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ab10d97d0c131d1da0ef2b80e70bd7cf8d1d512c3eaf32f2ff3181faa0763ed7"} Dec 03 12:18:16 crc kubenswrapper[4666]: I1203 12:18:16.532425 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:16 crc kubenswrapper[4666]: I1203 12:18:16.532220 4666 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:16 crc kubenswrapper[4666]: I1203 12:18:16.532460 4666 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:17 crc kubenswrapper[4666]: I1203 12:18:17.439608 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:17 crc kubenswrapper[4666]: I1203 12:18:17.439684 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:17 crc kubenswrapper[4666]: I1203 12:18:17.471059 4666 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]log ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]etcd ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/priority-and-fairness-filter ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-apiextensions-informers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-apiextensions-controllers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/crd-informer-synced ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-system-namespaces-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 03 12:18:17 crc kubenswrapper[4666]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/bootstrap-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/start-kube-aggregator-informers ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-registration-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-discovery-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]autoregister-completion ok Dec 03 12:18:17 crc kubenswrapper[4666]: [+]poststarthook/apiservice-openapi-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: 
[+]poststarthook/apiservice-openapiv3-controller ok Dec 03 12:18:17 crc kubenswrapper[4666]: livez check failed Dec 03 12:18:17 crc kubenswrapper[4666]: I1203 12:18:17.472916 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:18:20 crc kubenswrapper[4666]: I1203 12:18:20.250627 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:18:21 crc kubenswrapper[4666]: I1203 12:18:21.548765 4666 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 12:18:21 crc kubenswrapper[4666]: I1203 12:18:21.746058 4666 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0771c032-0451-435c-b426-2e41d4478728" Dec 03 12:18:22 crc kubenswrapper[4666]: I1203 12:18:22.569753 4666 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:22 crc kubenswrapper[4666]: I1203 12:18:22.570245 4666 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65d87524-d1ce-4e3a-88fc-229830eca10d" Dec 03 12:18:22 crc kubenswrapper[4666]: I1203 12:18:22.573734 4666 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0771c032-0451-435c-b426-2e41d4478728" Dec 03 12:18:23 crc kubenswrapper[4666]: I1203 12:18:23.337367 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:18:23 crc kubenswrapper[4666]: I1203 12:18:23.341728 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:18:23 crc kubenswrapper[4666]: I1203 12:18:23.579236 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:18:31 crc kubenswrapper[4666]: I1203 12:18:31.545478 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 12:18:31 crc kubenswrapper[4666]: I1203 12:18:31.915012 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:18:31 crc kubenswrapper[4666]: I1203 12:18:31.944590 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.062479 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.217967 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.232153 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 12:18:32 crc 
Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.361318 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.676583 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 12:18:32 crc kubenswrapper[4666]: I1203 12:18:32.695644 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.353564 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.376399 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.415826 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.518696 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.611188 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.616854 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.628162 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.646720 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.780682 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 03 12:18:33 crc kubenswrapper[4666]: I1203 12:18:33.813541 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.122049 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.192896 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.225752 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.283954 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.284806 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.285593 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.511766 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.516878 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.518410 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.548072 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.587543 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.754368 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.770383 4666 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.774924 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.774989 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.780025 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.797751 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.797727327 podStartE2EDuration="13.797727327s" podCreationTimestamp="2025-12-03 12:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:18:34.796100716 +0000 UTC m=+303.641061787" watchObservedRunningTime="2025-12-03 12:18:34.797727327 +0000 UTC m=+303.642688408"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.804697 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.844657 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.928391 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.960881 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.974922 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.977930 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 03 12:18:34 crc kubenswrapper[4666]: I1203 12:18:34.999764 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.145348 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.148250 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.178824 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.258724 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.273172 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.288626 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.305129 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.322465 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.363051 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.465163 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.490215 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.563703 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.678321 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.846766 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 03 12:18:35 crc kubenswrapper[4666]: I1203 12:18:35.949467 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.031899 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.091420 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.140862 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.398513 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.492965 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.549675 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.551462 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.621145 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.658903 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.687382 4666 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.715913 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.795486 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.901686 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.941821 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 03 12:18:36 crc kubenswrapper[4666]: I1203 12:18:36.967124 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.004461 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.103773 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.125423 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.140183 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.216397 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.264795 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.266922 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.445075 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.451744 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.536410 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.592469 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.611673 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.656744 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.678803 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.689699 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 12:18:37 crc kubenswrapper[4666]: I1203 12:18:37.831496 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.122587 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.140768 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.262552 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.304329 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.376226 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.477883 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.510960 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.552566 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.572376 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.595184 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.617363 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.620613 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.646522 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.659309 4666 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.736246 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.779349 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.791080 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.800254 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.819435 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.823933 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.874882 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.912519 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.929064 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 03 12:18:38 crc kubenswrapper[4666]: I1203 12:18:38.960502 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.038292 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.058221 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.075780 4666 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.145016 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.180500 4666 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.194015 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.213892 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.214781 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.248401 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.364730 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.401162 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.442584 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.470443 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.486633 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.514259 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.524316 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.617275 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.632463 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.638536 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.641001 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.710083 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.746383 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.758651 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.784167 4666 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.832976 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.848441 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.893886 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 03 12:18:39 crc kubenswrapper[4666]: I1203 12:18:39.978381 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.021406 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.142597 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.219889 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.286594 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.314341 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.347708 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.499795 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.505368 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.561968 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.575235 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.594476 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.659528 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.728678 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.839813 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.870456 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.931057 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 03 12:18:40 crc kubenswrapper[4666]: I1203 12:18:40.969064 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.061897 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.126914 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.176169 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.208846 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.237419 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.253224 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.274378 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.298864 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.588013 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.634070 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.648076 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.648966 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.674277 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.768485 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.771832 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 03 12:18:41 crc kubenswrapper[4666]: I1203 12:18:41.925258 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.044153 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.046988 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.124151 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.156619 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.185963 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.264357 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.278225 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.345680 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.399652 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.463667 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.463975 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.489643 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.600039 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.600158 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.600236 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.673746 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 03 12:18:42 crc kubenswrapper[4666]: I1203 12:18:42.799815 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.220461 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.257030 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.267452 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.273234 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.296580 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.367062 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.376955 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.412060 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.492223 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.552318 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.558710 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.662712 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.749679 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.827239 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.856889 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.883648 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 12:18:43 crc kubenswrapper[4666]: I1203 12:18:43.986922 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.027379 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.085456 4666 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.085739 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa" gracePeriod=5
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.107209 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.125070 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.197609 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.228331 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.349067 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.548236 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.601441 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.671631 4666 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.786971 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.794139 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.903147 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 03 12:18:44 crc kubenswrapper[4666]: I1203 12:18:44.974226 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.034348 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.097004 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.097813 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.099905 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.146792 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.322466 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.360307 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.501792 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.650910 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.656559 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.689773 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.889530 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.920961 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 03 12:18:45 crc kubenswrapper[4666]: I1203 12:18:45.944927 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.013846 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.064611 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.127040 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.187808 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.331964 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.507283 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.622548 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.638665 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.802362 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 03 12:18:46 crc kubenswrapper[4666]: I1203 12:18:46.834038 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 12:18:47 crc kubenswrapper[4666]: I1203 12:18:47.085127 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 12:18:47 crc kubenswrapper[4666]: I1203 12:18:47.426133 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 12:18:47 crc kubenswrapper[4666]: I1203 12:18:47.587232 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.695440 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.695675 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.739884 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.739970 4666 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa" exitCode=137
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.740029 4666 scope.go:117] "RemoveContainer" containerID="37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.740136 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.769504 4666 scope.go:117] "RemoveContainer" containerID="37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa"
Dec 03 12:18:49 crc kubenswrapper[4666]: E1203 12:18:49.771418 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa\": container with ID starting with 37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa not found: ID does not exist" containerID="37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.771526 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa"} err="failed to get container status \"37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa\": rpc error: code = NotFound desc = could not find container \"37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa\": container with ID starting with 37a02ea794771241f97dc96a796d30989753b751f5d009cc9b01543bf3f7fcfa not found: ID does not exist"
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849154 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849274 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849365 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849416 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849440 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849533 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849596 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849800 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.849823 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.850429 4666 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.850539 4666 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.850565 4666 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.850586 4666 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.861689 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:18:49 crc kubenswrapper[4666]: I1203 12:18:49.951478 4666 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 03 12:18:51 crc kubenswrapper[4666]: I1203 12:18:51.435738 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Dec 03 12:19:00 crc kubenswrapper[4666]: I1203 12:19:00.842627 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 03 12:19:02 crc kubenswrapper[4666]: I1203 12:19:02.319675 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 03 12:19:09 crc kubenswrapper[4666]: I1203 12:19:09.857922 4666 generic.go:334] "Generic (PLEG): container finished" podID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerID="c923133b9c6778c948c059975149403cd2850660aa03f5a47c31cf0121d57dcf" exitCode=0
Dec 03 12:19:09 crc kubenswrapper[4666]: I1203 12:19:09.858007 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerDied","Data":"c923133b9c6778c948c059975149403cd2850660aa03f5a47c31cf0121d57dcf"}
Dec 03 12:19:09 crc kubenswrapper[4666]: I1203 12:19:09.859022 4666 scope.go:117] "RemoveContainer" containerID="c923133b9c6778c948c059975149403cd2850660aa03f5a47c31cf0121d57dcf"
Dec 03 12:19:10 crc kubenswrapper[4666]: I1203 12:19:10.867765 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerStarted","Data":"b5ff76860b3df376ef88e6810add21f42af271399bcf3ca204d1290dc703ab7c"}
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:19:10 crc kubenswrapper[4666]: I1203 12:19:10.870970 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:19:13 crc kubenswrapper[4666]: I1203 12:19:13.626977 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 12:19:16 crc kubenswrapper[4666]: I1203 12:19:16.040405 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 12:19:16 crc kubenswrapper[4666]: I1203 12:19:16.416916 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 12:19:18 crc kubenswrapper[4666]: I1203 12:19:18.472424 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 12:19:19 crc kubenswrapper[4666]: I1203 12:19:19.058126 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 12:19:20 crc kubenswrapper[4666]: I1203 12:19:20.453504 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 12:19:21 crc kubenswrapper[4666]: I1203 12:19:21.002658 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 12:19:27 crc kubenswrapper[4666]: I1203 12:19:27.549130 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 12:19:30 crc kubenswrapper[4666]: I1203 12:19:30.873613 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:19:30 crc kubenswrapper[4666]: I1203 12:19:30.874512 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4knmd" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="registry-server" containerID="cri-o://c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100" gracePeriod=2 Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.945615 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.996114 4666 generic.go:334] "Generic (PLEG): container finished" podID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerID="c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100" exitCode=0 Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.996163 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerDied","Data":"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100"} Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.996192 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4knmd" event={"ID":"de3ac985-b5ac-4afa-9d61-1837bbe36c50","Type":"ContainerDied","Data":"c80cd7da4cda3524e8ea893e5aa1dd59041a015252053a7f79e0b67a731b5976"} Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.996197 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4knmd" Dec 03 12:19:31 crc kubenswrapper[4666]: I1203 12:19:31.996211 4666 scope.go:117] "RemoveContainer" containerID="c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.011670 4666 scope.go:117] "RemoveContainer" containerID="238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.031386 4666 scope.go:117] "RemoveContainer" containerID="c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.048874 4666 scope.go:117] "RemoveContainer" containerID="c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100" Dec 03 12:19:32 crc kubenswrapper[4666]: E1203 12:19:32.049546 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100\": container with ID starting with c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100 not found: ID does not exist" containerID="c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.049599 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100"} err="failed to get container status \"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100\": rpc error: code = NotFound desc = could not find container \"c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100\": container with ID starting with c82ead872f0e2be7202be8142122522f54c5c12cfabab1a9bc18a7e6dfbb4100 not found: ID does not exist" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.049629 4666 scope.go:117] "RemoveContainer" containerID="238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23" Dec 03 12:19:32 crc kubenswrapper[4666]: E1203 12:19:32.050015 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23\": container with ID starting with 238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23 not found: ID does not exist" 
containerID="238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.050165 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23"} err="failed to get container status \"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23\": rpc error: code = NotFound desc = could not find container \"238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23\": container with ID starting with 238d280f0cd85cab4a80c3dc17bb511cea43b522d31adad263cda4cb65b36d23 not found: ID does not exist" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.050303 4666 scope.go:117] "RemoveContainer" containerID="c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040" Dec 03 12:19:32 crc kubenswrapper[4666]: E1203 12:19:32.050957 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040\": container with ID starting with c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040 not found: ID does not exist" containerID="c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.051005 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040"} err="failed to get container status \"c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040\": rpc error: code = NotFound desc = could not find container \"c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040\": container with ID starting with c15de8cc6628bdd3ef216fc0c5b117ab90d4b35c5f7fa292750b70f30fa62040 not found: ID does not exist" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.072864 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities\") pod \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.072961 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qccw\" (UniqueName: \"kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw\") pod \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.073054 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content\") pod \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\" (UID: \"de3ac985-b5ac-4afa-9d61-1837bbe36c50\") " Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.074127 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities" (OuterVolumeSpecName: "utilities") pod "de3ac985-b5ac-4afa-9d61-1837bbe36c50" (UID: "de3ac985-b5ac-4afa-9d61-1837bbe36c50"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.079694 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw" (OuterVolumeSpecName: "kube-api-access-4qccw") pod "de3ac985-b5ac-4afa-9d61-1837bbe36c50" (UID: "de3ac985-b5ac-4afa-9d61-1837bbe36c50"). InnerVolumeSpecName "kube-api-access-4qccw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.122586 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de3ac985-b5ac-4afa-9d61-1837bbe36c50" (UID: "de3ac985-b5ac-4afa-9d61-1837bbe36c50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.174572 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.175138 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qccw\" (UniqueName: \"kubernetes.io/projected/de3ac985-b5ac-4afa-9d61-1837bbe36c50-kube-api-access-4qccw\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.175272 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3ac985-b5ac-4afa-9d61-1837bbe36c50-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.326731 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:19:32 crc kubenswrapper[4666]: I1203 12:19:32.330124 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4knmd"] Dec 03 12:19:33 crc kubenswrapper[4666]: I1203 12:19:33.072455 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:19:33 crc kubenswrapper[4666]: I1203 12:19:33.073239 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zhntz" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" containerID="cri-o://ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" gracePeriod=2 Dec 03 12:19:33 crc kubenswrapper[4666]: I1203 12:19:33.282926 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:19:33 crc kubenswrapper[4666]: I1203 12:19:33.283451 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rpltp" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="registry-server" containerID="cri-o://227ca731416a9a911095b3e13d60ef2dadf3c80107dfe1b163695ff006439bd8" gracePeriod=2 Dec 03 12:19:33 crc kubenswrapper[4666]: I1203 12:19:33.432501 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" path="/var/lib/kubelet/pods/de3ac985-b5ac-4afa-9d61-1837bbe36c50/volumes" Dec 03 12:19:33 crc kubenswrapper[4666]: E1203 12:19:33.643018 4666 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701 is running failed: container process not found" containerID="ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:19:33 crc kubenswrapper[4666]: E1203 12:19:33.644081 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701 is running failed: container process not found" containerID="ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:19:33 crc kubenswrapper[4666]: E1203 12:19:33.644529 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701 is running failed: container process not found" containerID="ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:19:33 crc kubenswrapper[4666]: E1203 12:19:33.644572 4666 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zhntz" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.016652 4666 generic.go:334] "Generic (PLEG): container finished" podID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerID="227ca731416a9a911095b3e13d60ef2dadf3c80107dfe1b163695ff006439bd8" exitCode=0 Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.016700 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerDied","Data":"227ca731416a9a911095b3e13d60ef2dadf3c80107dfe1b163695ff006439bd8"} Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.019845 4666 generic.go:334] "Generic (PLEG): container finished" podID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerID="ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" exitCode=0 Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.019889 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerDied","Data":"ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701"} Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.620496 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.794030 4666 util.go:48] "No ready sandbox for pod can be found. 
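[Editor's note: the ExecSync failures above are kubelet's readiness probe for the catalog pods, which execs grpc_health_probe -addr=:50051 inside the registry-server container; once the container process is gone, the exec returns NotFound and prober.go logs "Probe errored". A hedged Go sketch of the check that grpc_health_probe performs is below; the address matches the probe in the log, and it assumes a reachable server implementing the standard grpc.health.v1.Health service.]

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	// Same endpoint the probe targets inside the registry-server container.
	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// grpc_health_probe is essentially this single RPC against the standard
	// grpc.health.v1.Health service; an empty Service name checks the server overall.
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{Service: ""})
	if err != nil {
		fmt.Println("probe failed:", err) // what kubelet surfaces as "Probe errored"
		return
	}
	fmt.Println("status:", resp.GetStatus()) // SERVING when the container is ready
}
```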
Need to start a new one" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.810431 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qpgp\" (UniqueName: \"kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp\") pod \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.810546 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities\") pod \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.810662 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content\") pod \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\" (UID: \"f0c6678b-ef6b-4681-a187-cf69e14eff7e\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.811485 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities" (OuterVolumeSpecName: "utilities") pod "f0c6678b-ef6b-4681-a187-cf69e14eff7e" (UID: "f0c6678b-ef6b-4681-a187-cf69e14eff7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.821913 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp" (OuterVolumeSpecName: "kube-api-access-9qpgp") pod "f0c6678b-ef6b-4681-a187-cf69e14eff7e" (UID: "f0c6678b-ef6b-4681-a187-cf69e14eff7e"). InnerVolumeSpecName "kube-api-access-9qpgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.912007 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities\") pod \"130907ce-450a-4a73-92c6-aae2f3b2f850\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.912067 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldn8\" (UniqueName: \"kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8\") pod \"130907ce-450a-4a73-92c6-aae2f3b2f850\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.912888 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities" (OuterVolumeSpecName: "utilities") pod "130907ce-450a-4a73-92c6-aae2f3b2f850" (UID: "130907ce-450a-4a73-92c6-aae2f3b2f850"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.913076 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content\") pod \"130907ce-450a-4a73-92c6-aae2f3b2f850\" (UID: \"130907ce-450a-4a73-92c6-aae2f3b2f850\") " Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.913636 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qpgp\" (UniqueName: \"kubernetes.io/projected/f0c6678b-ef6b-4681-a187-cf69e14eff7e-kube-api-access-9qpgp\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.913663 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.913674 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.915180 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8" (OuterVolumeSpecName: "kube-api-access-6ldn8") pod "130907ce-450a-4a73-92c6-aae2f3b2f850" (UID: "130907ce-450a-4a73-92c6-aae2f3b2f850"). InnerVolumeSpecName "kube-api-access-6ldn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.938396 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0c6678b-ef6b-4681-a187-cf69e14eff7e" (UID: "f0c6678b-ef6b-4681-a187-cf69e14eff7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:34 crc kubenswrapper[4666]: I1203 12:19:34.984675 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "130907ce-450a-4a73-92c6-aae2f3b2f850" (UID: "130907ce-450a-4a73-92c6-aae2f3b2f850"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.015619 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130907ce-450a-4a73-92c6-aae2f3b2f850-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.015681 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ldn8\" (UniqueName: \"kubernetes.io/projected/130907ce-450a-4a73-92c6-aae2f3b2f850-kube-api-access-6ldn8\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.015698 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c6678b-ef6b-4681-a187-cf69e14eff7e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.029538 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhntz" event={"ID":"f0c6678b-ef6b-4681-a187-cf69e14eff7e","Type":"ContainerDied","Data":"77d9a2f091371e41798dbe075d324a8edbf486e208d4e75ee98d1045b8b7d12d"} Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.030013 4666 scope.go:117] "RemoveContainer" containerID="ebec7ab59cc0e9461407ce2d8de8f6ea70cfc8a035d6bf980cc2b481d810f701" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.029559 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhntz" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.034066 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpltp" event={"ID":"130907ce-450a-4a73-92c6-aae2f3b2f850","Type":"ContainerDied","Data":"8628159afe7c9c8b1bd9c978d55856d8de5267c690050166ee036cfe00f5e132"} Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.034201 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rpltp" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.049204 4666 scope.go:117] "RemoveContainer" containerID="60fd7da638e8fdd9c19575a6c2bd8f8b64494fc1f870ddae123981281a61491a" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.067330 4666 scope.go:117] "RemoveContainer" containerID="8f708bb2bfc6a24a7c027ca32d3e42386078079d9b3f233088fe9a1a259227d1" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.083353 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.084785 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rpltp"] Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.102705 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.106728 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zhntz"] Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.107226 4666 scope.go:117] "RemoveContainer" containerID="227ca731416a9a911095b3e13d60ef2dadf3c80107dfe1b163695ff006439bd8" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.122560 4666 scope.go:117] "RemoveContainer" containerID="a5e25060d27ca2122d15a9f0a8c5c3e072d425099da71422b26121797869da53" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.137241 4666 scope.go:117] "RemoveContainer" containerID="0bc9e0b852fc187defefd7c50e273748fbb864b7de935401b62a95e3150c0bc2" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.432803 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" path="/var/lib/kubelet/pods/130907ce-450a-4a73-92c6-aae2f3b2f850/volumes" Dec 03 12:19:35 crc kubenswrapper[4666]: I1203 12:19:35.434645 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" path="/var/lib/kubelet/pods/f0c6678b-ef6b-4681-a187-cf69e14eff7e/volumes" Dec 03 12:19:39 crc kubenswrapper[4666]: I1203 12:19:39.866680 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:19:39 crc kubenswrapper[4666]: I1203 12:19:39.867214 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:20:09 crc kubenswrapper[4666]: I1203 12:20:09.866274 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:20:09 crc kubenswrapper[4666]: I1203 12:20:09.867036 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:20:16 crc kubenswrapper[4666]: I1203 12:20:16.099897 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"] Dec 03 12:20:39 crc kubenswrapper[4666]: I1203 12:20:39.866712 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:20:39 crc kubenswrapper[4666]: I1203 12:20:39.867525 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:20:39 crc kubenswrapper[4666]: I1203 12:20:39.867588 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:20:39 crc kubenswrapper[4666]: I1203 12:20:39.868466 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:20:39 crc kubenswrapper[4666]: I1203 12:20:39.868565 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0" gracePeriod=600 Dec 03 12:20:40 crc kubenswrapper[4666]: I1203 12:20:40.460511 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0" exitCode=0 Dec 03 12:20:40 crc kubenswrapper[4666]: I1203 12:20:40.460770 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0"} Dec 03 12:20:40 crc kubenswrapper[4666]: I1203 12:20:40.461446 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac"} Dec 03 12:20:40 crc kubenswrapper[4666]: I1203 12:20:40.461553 4666 scope.go:117] "RemoveContainer" containerID="f2a2755bcd34d26fa6963d458a4d258594c8b2e00d62e2657ea58e862a9d9e08" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.129854 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" podUID="2cddd4e1-3283-4f65-a1bb-68d449471280" containerName="oauth-openshift" containerID="cri-o://7f54431144412239a93c31c2db9dd4141cc55da84d65b6d9688a7dd7cd426b36" gracePeriod=15 
Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.474320 4666 generic.go:334] "Generic (PLEG): container finished" podID="2cddd4e1-3283-4f65-a1bb-68d449471280" containerID="7f54431144412239a93c31c2db9dd4141cc55da84d65b6d9688a7dd7cd426b36" exitCode=0 Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.474383 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" event={"ID":"2cddd4e1-3283-4f65-a1bb-68d449471280","Type":"ContainerDied","Data":"7f54431144412239a93c31c2db9dd4141cc55da84d65b6d9688a7dd7cd426b36"} Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.474440 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" event={"ID":"2cddd4e1-3283-4f65-a1bb-68d449471280","Type":"ContainerDied","Data":"5e74f7f4dabce637c9b42895ef182cad1483991b64443d5d2b16f5b3a6141803"} Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.474507 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e74f7f4dabce637c9b42895ef182cad1483991b64443d5d2b16f5b3a6141803" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.507395 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.543463 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-745588b54f-9rl2r"] Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.546255 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546369 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.546447 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546504 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.546571 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546635 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.546701 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546765 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.546829 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546881 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 
12:20:41.546942 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.546999 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="extract-content" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547056 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547131 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547208 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547265 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547326 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547381 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547447 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cddd4e1-3283-4f65-a1bb-68d449471280" containerName="oauth-openshift" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547504 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cddd4e1-3283-4f65-a1bb-68d449471280" containerName="oauth-openshift" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547642 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547700 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="extract-utilities" Dec 03 12:20:41 crc kubenswrapper[4666]: E1203 12:20:41.547761 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" containerName="installer" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.547863 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" containerName="installer" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548018 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3ac985-b5ac-4afa-9d61-1837bbe36c50" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548077 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa8b052-a562-46f9-ab41-51aa3b3a0c39" containerName="installer" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548159 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c6678b-ef6b-4681-a187-cf69e14eff7e" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548216 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cddd4e1-3283-4f65-a1bb-68d449471280" containerName="oauth-openshift" Dec 03 12:20:41 crc 
kubenswrapper[4666]: I1203 12:20:41.548281 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548350 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="130907ce-450a-4a73-92c6-aae2f3b2f850" containerName="registry-server" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.548955 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.563932 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.563999 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564039 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564078 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564117 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564152 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564174 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564193 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca\") pod 
\"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564210 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564242 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564263 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx7zf\" (UniqueName: \"kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564303 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564336 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.564375 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data\") pod \"2cddd4e1-3283-4f65-a1bb-68d449471280\" (UID: \"2cddd4e1-3283-4f65-a1bb-68d449471280\") " Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.565308 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.565762 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.567031 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.568862 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.571653 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.575366 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.578748 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.580337 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.580825 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.583797 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf" (OuterVolumeSpecName: "kube-api-access-tx7zf") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "kube-api-access-tx7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.583885 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-745588b54f-9rl2r"] Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.590350 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.590814 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.594900 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.595465 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2cddd4e1-3283-4f65-a1bb-68d449471280" (UID: "2cddd4e1-3283-4f65-a1bb-68d449471280"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.665823 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-router-certs\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.665903 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-policies\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.665939 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666172 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666330 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-dir\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666375 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-login\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666411 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666452 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666501 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-error\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666539 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-session\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666601 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-service-ca\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666655 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666718 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666811 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnq2\" (UniqueName: \"kubernetes.io/projected/6da2da71-4ef1-41d2-aac4-00da7855d489-kube-api-access-bbnq2\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666892 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666910 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666926 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666940 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666954 4666 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666970 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.666985 4666 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2cddd4e1-3283-4f65-a1bb-68d449471280-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667000 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667014 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667028 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667041 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667054 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667069 4666 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2cddd4e1-3283-4f65-a1bb-68d449471280-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.667109 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx7zf\" (UniqueName: 
\"kubernetes.io/projected/2cddd4e1-3283-4f65-a1bb-68d449471280-kube-api-access-tx7zf\") on node \"crc\" DevicePath \"\"" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768434 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768515 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnq2\" (UniqueName: \"kubernetes.io/projected/6da2da71-4ef1-41d2-aac4-00da7855d489-kube-api-access-bbnq2\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768545 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-router-certs\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768565 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-policies\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768587 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768608 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768637 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-dir\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768660 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-login\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " 
pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768683 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768703 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768723 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-error\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768743 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-session\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768766 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-service-ca\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.768787 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.769431 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-dir\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.769971 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " 
pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.770023 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-audit-policies\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.770501 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.770786 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-service-ca\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.772907 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-router-certs\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.773535 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-login\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.773592 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.773939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.775491 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " 
pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.776366 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-system-session\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.776488 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.779504 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6da2da71-4ef1-41d2-aac4-00da7855d489-v4-0-config-user-template-error\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.787705 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnq2\" (UniqueName: \"kubernetes.io/projected/6da2da71-4ef1-41d2-aac4-00da7855d489-kube-api-access-bbnq2\") pod \"oauth-openshift-745588b54f-9rl2r\" (UID: \"6da2da71-4ef1-41d2-aac4-00da7855d489\") " pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:41 crc kubenswrapper[4666]: I1203 12:20:41.915632 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.146499 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-745588b54f-9rl2r"] Dec 03 12:20:42 crc kubenswrapper[4666]: W1203 12:20:42.155480 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da2da71_4ef1_41d2_aac4_00da7855d489.slice/crio-e186d97ef5a5f45c4627fef87a8cd4fa6d713f564c7756af099ca5375e6e09ca WatchSource:0}: Error finding container e186d97ef5a5f45c4627fef87a8cd4fa6d713f564c7756af099ca5375e6e09ca: Status 404 returned error can't find the container with id e186d97ef5a5f45c4627fef87a8cd4fa6d713f564c7756af099ca5375e6e09ca Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.481647 4666 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.481647 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5czcz"
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.483236 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" event={"ID":"6da2da71-4ef1-41d2-aac4-00da7855d489","Type":"ContainerStarted","Data":"fee1484fb55bd6a5437867ab47c6d344afe52eefac0f25264122a427918a2551"}
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.483284 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" event={"ID":"6da2da71-4ef1-41d2-aac4-00da7855d489","Type":"ContainerStarted","Data":"e186d97ef5a5f45c4627fef87a8cd4fa6d713f564c7756af099ca5375e6e09ca"}
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.483545 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r"
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.508844 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r" podStartSLOduration=26.508820058 podStartE2EDuration="26.508820058s" podCreationTimestamp="2025-12-03 12:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:20:42.507224435 +0000 UTC m=+431.352185486" watchObservedRunningTime="2025-12-03 12:20:42.508820058 +0000 UTC m=+431.353781109"
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.527383 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"]
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.533293 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5czcz"]
Dec 03 12:20:42 crc kubenswrapper[4666]: I1203 12:20:42.863706 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-745588b54f-9rl2r"
Dec 03 12:20:43 crc kubenswrapper[4666]: I1203 12:20:43.432774 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cddd4e1-3283-4f65-a1bb-68d449471280" path="/var/lib/kubelet/pods/2cddd4e1-3283-4f65-a1bb-68d449471280/volumes"
Dec 03 12:22:31 crc kubenswrapper[4666]: I1203 12:22:31.602227 4666 scope.go:117] "RemoveContainer" containerID="7f54431144412239a93c31c2db9dd4141cc55da84d65b6d9688a7dd7cd426b36"
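Editor's note: the podStartSLOduration value in the entry above is recoverable from the entry itself. With no image pull recorded (both pull timestamps are the zero time), the SLO duration reduces to watchObservedRunningTime minus podCreationTimestamp: 12:20:42.508820058 − 12:20:16 ≈ 26.5 s. A minimal Go sketch of that arithmetic, with the values copied from the entry (illustrative only, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the log entry above. With firstStartedPulling at the
	// zero time, podStartSLOduration collapses to
	// watchObservedRunningTime - podCreationTimestamp.
	created, _ := time.Parse(time.RFC3339, "2025-12-03T12:20:16Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-12-03T12:20:42.508820058Z")
	fmt.Printf("podStartSLOduration=%.9f\n", running.Sub(created).Seconds())
	// Output: podStartSLOduration=26.508820058
}
```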
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:23:39 crc kubenswrapper[4666]: I1203 12:23:39.867053 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:24:09 crc kubenswrapper[4666]: I1203 12:24:09.866171 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:24:09 crc kubenswrapper[4666]: I1203 12:24:09.866929 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:24:09 crc kubenswrapper[4666]: I1203 12:24:09.866992 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:24:09 crc kubenswrapper[4666]: I1203 12:24:09.867734 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:24:09 crc kubenswrapper[4666]: I1203 12:24:09.867795 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac" gracePeriod=600 Dec 03 12:24:10 crc kubenswrapper[4666]: I1203 12:24:10.920801 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac" exitCode=0 Dec 03 12:24:10 crc kubenswrapper[4666]: I1203 12:24:10.920814 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac"} Dec 03 12:24:10 crc kubenswrapper[4666]: I1203 12:24:10.921647 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6"} Dec 03 12:24:10 crc kubenswrapper[4666]: I1203 12:24:10.921674 4666 scope.go:117] "RemoveContainer" containerID="f902d3820b1cdaa69a222508ac420a330a388c9eac4b12d3a57b516661f8fab0" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.530308 4666 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-m2ngc"] Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.532046 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.559695 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m2ngc"] Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.657991 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/392ab76b-05dd-4a4a-b20d-00db53520c1e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658105 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658127 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-bound-sa-token\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658207 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvkv\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-kube-api-access-hzvkv\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658240 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-tls\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658288 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/392ab76b-05dd-4a4a-b20d-00db53520c1e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.658323 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-trusted-ca\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 
12:25:13.658376 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-certificates\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.683958 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759745 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-bound-sa-token\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759838 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvkv\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-kube-api-access-hzvkv\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759867 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-tls\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759907 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/392ab76b-05dd-4a4a-b20d-00db53520c1e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759943 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-trusted-ca\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.759973 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-certificates\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.760005 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/392ab76b-05dd-4a4a-b20d-00db53520c1e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.760947 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/392ab76b-05dd-4a4a-b20d-00db53520c1e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.762035 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-trusted-ca\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.762132 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-certificates\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.767799 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/392ab76b-05dd-4a4a-b20d-00db53520c1e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.771925 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-registry-tls\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.785292 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvkv\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-kube-api-access-hzvkv\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.785607 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392ab76b-05dd-4a4a-b20d-00db53520c1e-bound-sa-token\") pod \"image-registry-66df7c8f76-m2ngc\" (UID: \"392ab76b-05dd-4a4a-b20d-00db53520c1e\") " pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.852608 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:25:13 crc kubenswrapper[4666]: I1203 12:25:13.852608 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc"
Dec 03 12:25:14 crc kubenswrapper[4666]: I1203 12:25:14.291935 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m2ngc"]
Dec 03 12:25:14 crc kubenswrapper[4666]: I1203 12:25:14.407413 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" event={"ID":"392ab76b-05dd-4a4a-b20d-00db53520c1e","Type":"ContainerStarted","Data":"4b9d94104b14ffafbe390b91bdd20065f157a46c1bc770f5a2bb31541f3b812d"}
Dec 03 12:25:15 crc kubenswrapper[4666]: I1203 12:25:15.415332 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" event={"ID":"392ab76b-05dd-4a4a-b20d-00db53520c1e","Type":"ContainerStarted","Data":"beb6b75a539422183f13d42570901275de15ef1e31668556b28dc6974133dff2"}
Dec 03 12:25:15 crc kubenswrapper[4666]: I1203 12:25:15.415804 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc"
Dec 03 12:25:15 crc kubenswrapper[4666]: I1203 12:25:15.451131 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc" podStartSLOduration=2.451100076 podStartE2EDuration="2.451100076s" podCreationTimestamp="2025-12-03 12:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:25:15.443318235 +0000 UTC m=+704.288279316" watchObservedRunningTime="2025-12-03 12:25:15.451100076 +0000 UTC m=+704.296061137"
Dec 03 12:25:33 crc kubenswrapper[4666]: I1203 12:25:33.864836 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-m2ngc"
Dec 03 12:25:33 crc kubenswrapper[4666]: I1203 12:25:33.941187 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"]
Dec 03 12:25:58 crc kubenswrapper[4666]: I1203 12:25:58.991921 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" podUID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" containerName="registry" containerID="cri-o://2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383" gracePeriod=30
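Editor's note: "Killing container with a grace period" (gracePeriod=30 here, 600 for the machine-config-daemon earlier) describes the usual termination contract: the runtime delivers SIGTERM first and escalates to SIGKILL only if the process outlives the grace period. A rough, Unix-only Go sketch of that contract against a local process — illustrative only, since CRI-O enforces this at the runtime level:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace approximates the gracePeriod behavior recorded above:
// SIGTERM, then SIGKILL once the grace period expires.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		cmd.Process.Kill() // SIGKILL, no further warning
		<-done
		fmt.Println("killed after grace period expired")
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	cmd.Start()
	killWithGrace(cmd, 30*time.Second)
}
```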
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.362528 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp"
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.505881 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506020 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506080 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506170 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506217 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506252 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506446 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.506503 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmtv\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv\") pod \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\" (UID: \"09abca61-c4c9-4f0c-beb9-468eea7e3f95\") "
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.507312 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.508952 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.517279 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.517940 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.518679 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv" (OuterVolumeSpecName: "kube-api-access-wqmtv") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "kube-api-access-wqmtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.522764 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.525806 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.528343 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "09abca61-c4c9-4f0c-beb9-468eea7e3f95" (UID: "09abca61-c4c9-4f0c-beb9-468eea7e3f95"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608202 4666 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608256 4666 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608282 4666 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608300 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09abca61-c4c9-4f0c-beb9-468eea7e3f95-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608318 4666 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09abca61-c4c9-4f0c-beb9-468eea7e3f95-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608339 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmtv\" (UniqueName: \"kubernetes.io/projected/09abca61-c4c9-4f0c-beb9-468eea7e3f95-kube-api-access-wqmtv\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.608424 4666 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09abca61-c4c9-4f0c-beb9-468eea7e3f95-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.715317 4666 generic.go:334] "Generic (PLEG): container finished" podID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" containerID="2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383" exitCode=0 Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.715367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" event={"ID":"09abca61-c4c9-4f0c-beb9-468eea7e3f95","Type":"ContainerDied","Data":"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383"} Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.715406 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" event={"ID":"09abca61-c4c9-4f0c-beb9-468eea7e3f95","Type":"ContainerDied","Data":"8d235e6a3a60d451a15cdd15fb91f72166bb89d52c81a84cd716e3622da061f4"} Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.715434 4666 scope.go:117] "RemoveContainer" containerID="2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.715482 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt6lp" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.744428 4666 scope.go:117] "RemoveContainer" containerID="2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383" Dec 03 12:25:59 crc kubenswrapper[4666]: E1203 12:25:59.745265 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383\": container with ID starting with 2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383 not found: ID does not exist" containerID="2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.745307 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383"} err="failed to get container status \"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383\": rpc error: code = NotFound desc = could not find container \"2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383\": container with ID starting with 2651fc63de3a85a8d37fbcf6ab7ea6fd2474bd91d75e0d880043415348f4c383 not found: ID does not exist" Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.769534 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"] Dec 03 12:25:59 crc kubenswrapper[4666]: I1203 12:25:59.776103 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt6lp"] Dec 03 12:26:01 crc kubenswrapper[4666]: I1203 12:26:01.437752 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" path="/var/lib/kubelet/pods/09abca61-c4c9-4f0c-beb9-468eea7e3f95/volumes" Dec 03 12:26:17 crc kubenswrapper[4666]: I1203 12:26:17.085854 4666 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 12:26:39 crc kubenswrapper[4666]: I1203 12:26:39.866418 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:26:39 crc kubenswrapper[4666]: I1203 12:26:39.867252 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:27:09 crc kubenswrapper[4666]: I1203 12:27:09.866772 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:27:09 crc kubenswrapper[4666]: I1203 12:27:09.867499 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get 
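Editor's note: the RemoveContainer / NotFound exchange above is benign — the container was already gone by the time the kubelet asked for its status, so DeleteContainer logs the NotFound and the sync moves on. A sketch of why that works, treating removal as idempotent (stubbed error values, not the actual CRI types):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the gRPC NotFound the runtime returned above.
var errNotFound = errors.New("NotFound: ID does not exist")

// containerStatus is a hypothetical stub that always reports the container
// as already gone, like the runtime did in the log.
func containerStatus(id string) error { return errNotFound }

// removeContainer treats NotFound as "already removed", which is what lets
// the kubelet log the DeleteContainer error yet still make progress.
func removeContainer(id string) error {
	if err := containerStatus(id); err != nil {
		if errors.Is(err, errNotFound) {
			return nil // someone else already removed it; nothing to do
		}
		return err
	}
	fmt.Println("removing", id)
	return nil
}

func main() {
	fmt.Println(removeContainer("2651fc63de3a85a8d3"))
}
```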
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:27:39 crc kubenswrapper[4666]: I1203 12:27:39.866141 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:27:39 crc kubenswrapper[4666]: I1203 12:27:39.866750 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:27:39 crc kubenswrapper[4666]: I1203 12:27:39.866803 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:27:39 crc kubenswrapper[4666]: I1203 12:27:39.867556 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:27:39 crc kubenswrapper[4666]: I1203 12:27:39.867630 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6" gracePeriod=600 Dec 03 12:27:40 crc kubenswrapper[4666]: I1203 12:27:40.706816 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6" exitCode=0 Dec 03 12:27:40 crc kubenswrapper[4666]: I1203 12:27:40.706890 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6"} Dec 03 12:27:40 crc kubenswrapper[4666]: I1203 12:27:40.707492 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522"} Dec 03 12:27:40 crc kubenswrapper[4666]: I1203 12:27:40.707513 4666 scope.go:117] "RemoveContainer" containerID="42f1fdc9e0109a6fea38adeca948249efd34dc5629bf48ee1ef90f8243a581ac" Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.466456 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.467548 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fk2lr" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="registry-server" containerID="cri-o://c2c00e29009e1547db66580d47b73ebf5a3d61855e583e3381fd1476351413c7" gracePeriod=30 Dec 03 12:27:48 crc 
kubenswrapper[4666]: I1203 12:27:48.473548 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.474123 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5fq8n" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="registry-server" containerID="cri-o://ab62ebe12477f6221e1bf443749213a8bc85e7829527e505948c1c1983273f77" gracePeriod=30 Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.486254 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"] Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.486487 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" containerID="cri-o://b5ff76860b3df376ef88e6810add21f42af271399bcf3ca204d1290dc703ab7c" gracePeriod=30 Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.497486 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.497808 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnm99" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="registry-server" containerID="cri-o://948b2e0773007541791722ae090532dbe4db79eb4a21f7e164b7f3bd146eba23" gracePeriod=30 Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.516362 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vrds"] Dec 03 12:27:48 crc kubenswrapper[4666]: E1203 12:27:48.516652 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" containerName="registry" Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.516667 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" containerName="registry" Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.516788 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="09abca61-c4c9-4f0c-beb9-468eea7e3f95" containerName="registry" Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.517248 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.517248 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.520786 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.521226 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8lpw" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="registry-server" containerID="cri-o://627a6258aab26ecd1ab5d7a76e6f947f8ec1bd67bbdc6bb181038151d583354f" gracePeriod=30
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.531858 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vrds"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.610035 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gc5\" (UniqueName: \"kubernetes.io/projected/d829bfc6-3cf6-4b30-a501-1586386d7698-kube-api-access-j6gc5\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.610638 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.610668 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.681816 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.683022 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.708594 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.713802 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl79l\" (UniqueName: \"kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.713881 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.713921 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gc5\" (UniqueName: \"kubernetes.io/projected/d829bfc6-3cf6-4b30-a501-1586386d7698-kube-api-access-j6gc5\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.714247 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.714333 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.714355 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.715945 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.724519 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d829bfc6-3cf6-4b30-a501-1586386d7698-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.740044 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gc5\" (UniqueName: \"kubernetes.io/projected/d829bfc6-3cf6-4b30-a501-1586386d7698-kube-api-access-j6gc5\") pod \"marketplace-operator-79b997595-2vrds\" (UID: \"d829bfc6-3cf6-4b30-a501-1586386d7698\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.764296 4666 generic.go:334] "Generic (PLEG): container finished" podID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerID="627a6258aab26ecd1ab5d7a76e6f947f8ec1bd67bbdc6bb181038151d583354f" exitCode=0
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.764382 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerDied","Data":"627a6258aab26ecd1ab5d7a76e6f947f8ec1bd67bbdc6bb181038151d583354f"}
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.767386 4666 generic.go:334] "Generic (PLEG): container finished" podID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerID="ab62ebe12477f6221e1bf443749213a8bc85e7829527e505948c1c1983273f77" exitCode=0
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.767462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerDied","Data":"ab62ebe12477f6221e1bf443749213a8bc85e7829527e505948c1c1983273f77"}
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.769758 4666 generic.go:334] "Generic (PLEG): container finished" podID="119ceb76-1272-4185-a39b-70203557b901" containerID="c2c00e29009e1547db66580d47b73ebf5a3d61855e583e3381fd1476351413c7" exitCode=0
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.769825 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerDied","Data":"c2c00e29009e1547db66580d47b73ebf5a3d61855e583e3381fd1476351413c7"}
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.771614 4666 generic.go:334] "Generic (PLEG): container finished" podID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerID="b5ff76860b3df376ef88e6810add21f42af271399bcf3ca204d1290dc703ab7c" exitCode=0
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.771710 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerDied","Data":"b5ff76860b3df376ef88e6810add21f42af271399bcf3ca204d1290dc703ab7c"}
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.771772 4666 scope.go:117] "RemoveContainer" containerID="c923133b9c6778c948c059975149403cd2850660aa03f5a47c31cf0121d57dcf"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.778292 4666 generic.go:334] "Generic (PLEG): container finished" podID="36c1a423-51ec-4cce-bb65-f397809c6848" containerID="948b2e0773007541791722ae090532dbe4db79eb4a21f7e164b7f3bd146eba23" exitCode=0
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.779367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerDied","Data":"948b2e0773007541791722ae090532dbe4db79eb4a21f7e164b7f3bd146eba23"}
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.816750 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl79l\" (UniqueName: \"kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.816855 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.816998 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.820169 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.820194 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.837466 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl79l\" (UniqueName: \"kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l\") pod \"certified-operators-6qjsp\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") " pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.843034 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.890160 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rbj9s"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.892986 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.895108 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.904891 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fq8n"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.909534 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbj9s"]
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.922061 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.922141 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.922173 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsg8h\" (UniqueName: \"kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.924944 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk2lr"
Dec 03 12:27:48 crc kubenswrapper[4666]: I1203 12:27:48.948312 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.023569 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5cx\" (UniqueName: \"kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx\") pod \"119ceb76-1272-4185-a39b-70203557b901\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.023934 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities\") pod \"119ceb76-1272-4185-a39b-70203557b901\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024058 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8lh9\" (UniqueName: \"kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9\") pod \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024196 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics\") pod \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024286 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content\") pod \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024390 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities\") pod \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024495 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca\") pod \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\" (UID: \"b87e23b4-7fdc-42d3-b940-906c38fbd4ce\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024587 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content\") pod \"119ceb76-1272-4185-a39b-70203557b901\" (UID: \"119ceb76-1272-4185-a39b-70203557b901\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024679 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9srb7\" (UniqueName: \"kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7\") pod \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\" (UID: \"ed8c4c35-0630-4f7c-aa9a-008a349c70db\") "
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.024976 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.025097 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.025183 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsg8h\" (UniqueName: \"kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.026333 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnm99"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.029666 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.030737 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities" (OuterVolumeSpecName: "utilities") pod "119ceb76-1272-4185-a39b-70203557b901" (UID: "119ceb76-1272-4185-a39b-70203557b901"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.032796 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.033344 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx" (OuterVolumeSpecName: "kube-api-access-tt5cx") pod "119ceb76-1272-4185-a39b-70203557b901" (UID: "119ceb76-1272-4185-a39b-70203557b901"). InnerVolumeSpecName "kube-api-access-tt5cx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.034164 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b87e23b4-7fdc-42d3-b940-906c38fbd4ce" (UID: "b87e23b4-7fdc-42d3-b940-906c38fbd4ce"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.039193 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities" (OuterVolumeSpecName: "utilities") pod "ed8c4c35-0630-4f7c-aa9a-008a349c70db" (UID: "ed8c4c35-0630-4f7c-aa9a-008a349c70db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.039780 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b87e23b4-7fdc-42d3-b940-906c38fbd4ce" (UID: "b87e23b4-7fdc-42d3-b940-906c38fbd4ce"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.052584 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9" (OuterVolumeSpecName: "kube-api-access-b8lh9") pod "b87e23b4-7fdc-42d3-b940-906c38fbd4ce" (UID: "b87e23b4-7fdc-42d3-b940-906c38fbd4ce"). InnerVolumeSpecName "kube-api-access-b8lh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.052622 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsg8h\" (UniqueName: \"kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h\") pod \"community-operators-rbj9s\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") " pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.053300 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7" (OuterVolumeSpecName: "kube-api-access-9srb7") pod "ed8c4c35-0630-4f7c-aa9a-008a349c70db" (UID: "ed8c4c35-0630-4f7c-aa9a-008a349c70db"). InnerVolumeSpecName "kube-api-access-9srb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.119573 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.127807 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8b6q\" (UniqueName: \"kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q\") pod \"36c1a423-51ec-4cce-bb65-f397809c6848\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.127957 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities\") pod \"36c1a423-51ec-4cce-bb65-f397809c6848\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128018 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content\") pod \"36c1a423-51ec-4cce-bb65-f397809c6848\" (UID: \"36c1a423-51ec-4cce-bb65-f397809c6848\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128424 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128439 4666 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128456 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9srb7\" (UniqueName: \"kubernetes.io/projected/ed8c4c35-0630-4f7c-aa9a-008a349c70db-kube-api-access-9srb7\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128469 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5cx\" (UniqueName: \"kubernetes.io/projected/119ceb76-1272-4185-a39b-70203557b901-kube-api-access-tt5cx\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128481 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128493 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8lh9\" (UniqueName: \"kubernetes.io/projected/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-kube-api-access-b8lh9\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.128503 4666 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87e23b4-7fdc-42d3-b940-906c38fbd4ce-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.133370 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities" (OuterVolumeSpecName: "utilities") pod "36c1a423-51ec-4cce-bb65-f397809c6848" (UID: "36c1a423-51ec-4cce-bb65-f397809c6848"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.156951 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c1a423-51ec-4cce-bb65-f397809c6848" (UID: "36c1a423-51ec-4cce-bb65-f397809c6848"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.158291 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8c4c35-0630-4f7c-aa9a-008a349c70db" (UID: "ed8c4c35-0630-4f7c-aa9a-008a349c70db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.163162 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "119ceb76-1272-4185-a39b-70203557b901" (UID: "119ceb76-1272-4185-a39b-70203557b901"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.166461 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q" (OuterVolumeSpecName: "kube-api-access-h8b6q") pod "36c1a423-51ec-4cce-bb65-f397809c6848" (UID: "36c1a423-51ec-4cce-bb65-f397809c6848"). InnerVolumeSpecName "kube-api-access-h8b6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.229614 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp99d\" (UniqueName: \"kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d\") pod \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.232839 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d" (OuterVolumeSpecName: "kube-api-access-xp99d") pod "614c2c74-3d6f-4930-8c3e-a1bd11714e03" (UID: "614c2c74-3d6f-4930-8c3e-a1bd11714e03"). InnerVolumeSpecName "kube-api-access-xp99d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.233875 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content\") pod \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.234024 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities\") pod \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\" (UID: \"614c2c74-3d6f-4930-8c3e-a1bd11714e03\") " Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235263 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp99d\" (UniqueName: \"kubernetes.io/projected/614c2c74-3d6f-4930-8c3e-a1bd11714e03-kube-api-access-xp99d\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235295 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235308 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8c4c35-0630-4f7c-aa9a-008a349c70db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235320 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c1a423-51ec-4cce-bb65-f397809c6848-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235331 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119ceb76-1272-4185-a39b-70203557b901-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235342 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8b6q\" (UniqueName: \"kubernetes.io/projected/36c1a423-51ec-4cce-bb65-f397809c6848-kube-api-access-h8b6q\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235359 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbj9s" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.235826 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities" (OuterVolumeSpecName: "utilities") pod "614c2c74-3d6f-4930-8c3e-a1bd11714e03" (UID: "614c2c74-3d6f-4930-8c3e-a1bd11714e03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.263262 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vrds"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.313274 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"] Dec 03 12:27:49 crc kubenswrapper[4666]: W1203 12:27:49.328769 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c14f5eb_2fd9_4ecb_8e0e_7e1d8fddf847.slice/crio-77316fe0f74aefed2c83fd63159f070b9dd04ab1a6ca440870d4fc67bf38b2c3 WatchSource:0}: Error finding container 77316fe0f74aefed2c83fd63159f070b9dd04ab1a6ca440870d4fc67bf38b2c3: Status 404 returned error can't find the container with id 77316fe0f74aefed2c83fd63159f070b9dd04ab1a6ca440870d4fc67bf38b2c3 Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.337134 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.362889 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "614c2c74-3d6f-4930-8c3e-a1bd11714e03" (UID: "614c2c74-3d6f-4930-8c3e-a1bd11714e03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.439625 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/614c2c74-3d6f-4930-8c3e-a1bd11714e03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.481729 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbj9s"] Dec 03 12:27:49 crc kubenswrapper[4666]: W1203 12:27:49.526385 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46838e1f_61c7_4f5c_ad1f_240091d5df80.slice/crio-31b6d2921df49e31b6a022c2149af1859349f15284542a4caa32ea27ea34ec93 WatchSource:0}: Error finding container 31b6d2921df49e31b6a022c2149af1859349f15284542a4caa32ea27ea34ec93: Status 404 returned error can't find the container with id 31b6d2921df49e31b6a022c2149af1859349f15284542a4caa32ea27ea34ec93 Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.788994 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fq8n" event={"ID":"ed8c4c35-0630-4f7c-aa9a-008a349c70db","Type":"ContainerDied","Data":"fe25a6399d799764b2638e12b9bd00bc19cc6178374c812c75c73c9034847859"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.789064 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fq8n" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.789512 4666 scope.go:117] "RemoveContainer" containerID="ab62ebe12477f6221e1bf443749213a8bc85e7829527e505948c1c1983273f77" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.793478 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk2lr" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.793469 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk2lr" event={"ID":"119ceb76-1272-4185-a39b-70203557b901","Type":"ContainerDied","Data":"d4c34be7af90fb25ba9552fa7eb2d36573893a36a6ea661239d42da20c0d2450"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.795628 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds" event={"ID":"d829bfc6-3cf6-4b30-a501-1586386d7698","Type":"ContainerStarted","Data":"aa5a0101cfd677253faa0e376832ad4483b02e6842dc12a5fe09893122e555f2"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.795671 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds" event={"ID":"d829bfc6-3cf6-4b30-a501-1586386d7698","Type":"ContainerStarted","Data":"1689f201bfd9f08433e424b2d2e2f26a0ed5cdb902b30fc65846bb581e5b7fa1"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.795859 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.799353 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" event={"ID":"b87e23b4-7fdc-42d3-b940-906c38fbd4ce","Type":"ContainerDied","Data":"90f995b216ab1a93eeaa3ca2cd345167f4f3a6ee64b306e4dbe0aae10e79c944"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.799445 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgdb2" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.801249 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.806308 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnm99" event={"ID":"36c1a423-51ec-4cce-bb65-f397809c6848","Type":"ContainerDied","Data":"eb89825d608f732e0321032267f35f3e642a80c0c08e1a378879dc6d3610bcb8"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.806372 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnm99" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.810016 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8lpw" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.810472 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8lpw" event={"ID":"614c2c74-3d6f-4930-8c3e-a1bd11714e03","Type":"ContainerDied","Data":"85ed9793b1d0dbdc50edb6e7a0ed798c3fb9eead848c31c1b05d34caf45a0bd6"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.811915 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerID="a08fb3d19095f4470080867022bad4f5d6779422d78bd84b5a70e0dd5a6aa012" exitCode=0 Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.812003 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerDied","Data":"a08fb3d19095f4470080867022bad4f5d6779422d78bd84b5a70e0dd5a6aa012"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.812039 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerStarted","Data":"77316fe0f74aefed2c83fd63159f070b9dd04ab1a6ca440870d4fc67bf38b2c3"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.814145 4666 scope.go:117] "RemoveContainer" containerID="76f6fae45936d11b2759b1f4bce57ffa0088334a6890072ff5fefd34b674395c" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.814590 4666 generic.go:334] "Generic (PLEG): container finished" podID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerID="593d1884d7db4c54b685b02928568029e3a91f88d3b5120d64259f326de58be9" exitCode=0 Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.814637 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerDied","Data":"593d1884d7db4c54b685b02928568029e3a91f88d3b5120d64259f326de58be9"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.814668 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerStarted","Data":"31b6d2921df49e31b6a022c2149af1859349f15284542a4caa32ea27ea34ec93"} Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.815473 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.837858 4666 scope.go:117] "RemoveContainer" containerID="30518dee452786ff5ed41ca036a3dfbf8ef227439a34736a0eb5525dd5c5d5db" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.840150 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2vrds" podStartSLOduration=1.840127488 podStartE2EDuration="1.840127488s" podCreationTimestamp="2025-12-03 12:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:27:49.83799477 +0000 UTC m=+858.682955821" watchObservedRunningTime="2025-12-03 12:27:49.840127488 +0000 UTC m=+858.685088539" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.860167 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.871150 4666 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5fq8n"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.873413 4666 scope.go:117] "RemoveContainer" containerID="c2c00e29009e1547db66580d47b73ebf5a3d61855e583e3381fd1476351413c7" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.881579 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.890146 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fk2lr"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.896490 4666 scope.go:117] "RemoveContainer" containerID="fe1f8b5e1aeea194af67442b29796e529ffa2866b234430c67084ecdc19b9821" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.931725 4666 scope.go:117] "RemoveContainer" containerID="e6dd4f0cb038dc6184fbabe24938f3f05307fadecb38e53342863f6be0c9dd44" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.947023 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.950236 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8lpw"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.951276 4666 scope.go:117] "RemoveContainer" containerID="b5ff76860b3df376ef88e6810add21f42af271399bcf3ca204d1290dc703ab7c" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.963311 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.967460 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgdb2"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.978198 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.978471 4666 scope.go:117] "RemoveContainer" containerID="948b2e0773007541791722ae090532dbe4db79eb4a21f7e164b7f3bd146eba23" Dec 03 12:27:49 crc kubenswrapper[4666]: I1203 12:27:49.984639 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnm99"] Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.000948 4666 scope.go:117] "RemoveContainer" containerID="1911aef9191e22eff76b2ed92cdeb31853bec6526d8dfdc763f4f2b012173dd1" Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.016820 4666 scope.go:117] "RemoveContainer" containerID="90c704b31ebbf588ec9d750c43d338c32d887bac674d21c203291030671672b0" Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.031347 4666 scope.go:117] "RemoveContainer" containerID="627a6258aab26ecd1ab5d7a76e6f947f8ec1bd67bbdc6bb181038151d583354f" Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.048661 4666 scope.go:117] "RemoveContainer" containerID="4b8b1e97495270cf47fdd50174ab08dfb392811ea41649b90fb59b985dbec8b5" Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.065467 4666 scope.go:117] "RemoveContainer" containerID="a65ebf7b759cee83f7d93e781fb37794cf30a6fc537292c5b786b88599005488" Dec 03 12:27:50 crc kubenswrapper[4666]: I1203 12:27:50.828266 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" 
event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerStarted","Data":"26d7033e9087ef8afc5f705ea17a0662246723093cb4c9e8b067567d6b6ba5ce"} Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.084487 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fsqg"] Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085028 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085044 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085060 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085066 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085076 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085138 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085148 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085153 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085162 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085168 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085175 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085182 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085192 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085198 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085205 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085211 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="extract-content" Dec 03 12:27:51 crc 
kubenswrapper[4666]: E1203 12:27:51.085221 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085227 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085236 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085243 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="extract-utilities" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085253 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085259 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="extract-content" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085286 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085293 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085304 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085309 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085399 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085412 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="119ceb76-1272-4185-a39b-70203557b901" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085419 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085428 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085438 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" containerName="registry-server" Dec 03 12:27:51 crc kubenswrapper[4666]: E1203 12:27:51.085551 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085558 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.085638 4666 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" containerName="marketplace-operator" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.086516 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.092439 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.094170 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fsqg"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.265551 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-utilities\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.265641 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29kth\" (UniqueName: \"kubernetes.io/projected/8229727b-4723-46f6-919d-1eb721caefd1-kube-api-access-29kth\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.265709 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-catalog-content\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.284427 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwb2h"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.288411 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.292603 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.300049 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwb2h"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.367401 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-utilities\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.367515 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29kth\" (UniqueName: \"kubernetes.io/projected/8229727b-4723-46f6-919d-1eb721caefd1-kube-api-access-29kth\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.367747 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-catalog-content\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.368443 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-catalog-content\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.368559 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8229727b-4723-46f6-919d-1eb721caefd1-utilities\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.393201 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29kth\" (UniqueName: \"kubernetes.io/projected/8229727b-4723-46f6-919d-1eb721caefd1-kube-api-access-29kth\") pod \"redhat-marketplace-2fsqg\" (UID: \"8229727b-4723-46f6-919d-1eb721caefd1\") " pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.413691 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fsqg" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.438202 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119ceb76-1272-4185-a39b-70203557b901" path="/var/lib/kubelet/pods/119ceb76-1272-4185-a39b-70203557b901/volumes" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.438920 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c1a423-51ec-4cce-bb65-f397809c6848" path="/var/lib/kubelet/pods/36c1a423-51ec-4cce-bb65-f397809c6848/volumes" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.439580 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="614c2c74-3d6f-4930-8c3e-a1bd11714e03" path="/var/lib/kubelet/pods/614c2c74-3d6f-4930-8c3e-a1bd11714e03/volumes" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.440681 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87e23b4-7fdc-42d3-b940-906c38fbd4ce" path="/var/lib/kubelet/pods/b87e23b4-7fdc-42d3-b940-906c38fbd4ce/volumes" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.441165 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8c4c35-0630-4f7c-aa9a-008a349c70db" path="/var/lib/kubelet/pods/ed8c4c35-0630-4f7c-aa9a-008a349c70db/volumes" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.472923 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-catalog-content\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.473157 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-utilities\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.473303 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/82ff889b-3fab-481d-b9a0-36991fb87e8f-kube-api-access-wr8s8\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.482837 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.487314 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.491201 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575076 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-catalog-content\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575580 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-utilities\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575625 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln46p\" (UniqueName: \"kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575641 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575680 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.575740 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/82ff889b-3fab-481d-b9a0-36991fb87e8f-kube-api-access-wr8s8\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.577384 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-catalog-content\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.577884 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82ff889b-3fab-481d-b9a0-36991fb87e8f-utilities\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.601933 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/82ff889b-3fab-481d-b9a0-36991fb87e8f-kube-api-access-wr8s8\") pod \"redhat-operators-vwb2h\" (UID: \"82ff889b-3fab-481d-b9a0-36991fb87e8f\") " pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.633554 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fsqg"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.669725 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwb2h" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.678228 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln46p\" (UniqueName: \"kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.678311 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.678369 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.679124 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.679461 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.685123 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.686340 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.692876 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.709064 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln46p\" (UniqueName: \"kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p\") pod \"redhat-marketplace-7889c\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.779453 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.779507 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.779538 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xd8\" (UniqueName: \"kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.815451 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.873905 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerID="26d7033e9087ef8afc5f705ea17a0662246723093cb4c9e8b067567d6b6ba5ce" exitCode=0 Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.873972 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerDied","Data":"26d7033e9087ef8afc5f705ea17a0662246723093cb4c9e8b067567d6b6ba5ce"} Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.880967 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881010 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xd8\" (UniqueName: \"kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881632 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881735 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881842 4666 generic.go:334] "Generic (PLEG): container finished" podID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerID="a07a8ea57064b9f59e72c358631ec4566f2da59fed83e908f0c375ca3781775a" exitCode=0 Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.881927 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerDied","Data":"a07a8ea57064b9f59e72c358631ec4566f2da59fed83e908f0c375ca3781775a"} Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.884237 4666 generic.go:334] "Generic (PLEG): container finished" podID="8229727b-4723-46f6-919d-1eb721caefd1" containerID="cb23a99ee40d1d0faf8c7ce5d521580296d938a24c324ccddc563b8afd39c687" exitCode=0 Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.884320 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2fsqg" event={"ID":"8229727b-4723-46f6-919d-1eb721caefd1","Type":"ContainerDied","Data":"cb23a99ee40d1d0faf8c7ce5d521580296d938a24c324ccddc563b8afd39c687"} Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.884369 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fsqg" event={"ID":"8229727b-4723-46f6-919d-1eb721caefd1","Type":"ContainerStarted","Data":"d9fc22bc8b32b73ee4990f1a93c16b03905386163f5baab706eca2451336065a"} Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.884952 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwb2h"] Dec 03 12:27:51 crc kubenswrapper[4666]: I1203 12:27:51.906271 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xd8\" (UniqueName: \"kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8\") pod \"redhat-operators-r7r4t\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.035787 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"] Dec 03 12:27:52 crc kubenswrapper[4666]: W1203 12:27:52.043834 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podece082ea_3119_43c4_9d36_152e62629542.slice/crio-cffc922f40211b84a3f404b465c14bd6f29dd9b43cda464b2bf7a4a71260daab WatchSource:0}: Error finding container cffc922f40211b84a3f404b465c14bd6f29dd9b43cda464b2bf7a4a71260daab: Status 404 returned error can't find the container with id cffc922f40211b84a3f404b465c14bd6f29dd9b43cda464b2bf7a4a71260daab Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.077808 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.274935 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"] Dec 03 12:27:52 crc kubenswrapper[4666]: W1203 12:27:52.343278 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d16e83_a84e_4114_845d_a4fdf158100e.slice/crio-a213228e6968b1b985444f4c5a4927bf3a149a7df528729c396ccee4f1271b4a WatchSource:0}: Error finding container a213228e6968b1b985444f4c5a4927bf3a149a7df528729c396ccee4f1271b4a: Status 404 returned error can't find the container with id a213228e6968b1b985444f4c5a4927bf3a149a7df528729c396ccee4f1271b4a Dec 03 12:27:52 crc kubenswrapper[4666]: E1203 12:27:52.668859 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d16e83_a84e_4114_845d_a4fdf158100e.slice/crio-9362d9eb8e6505523ff31ef63d3f467e0cbdf74f3fb4da0ad4209e11c7f41a48.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.890873 4666 generic.go:334] "Generic (PLEG): container finished" podID="ece082ea-3119-43c4-9d36-152e62629542" containerID="00ffecfca5d56b31140c0fe4a1f00c6ff0a26f084945783f97549665617eec3d" exitCode=0 Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.890937 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerDied","Data":"00ffecfca5d56b31140c0fe4a1f00c6ff0a26f084945783f97549665617eec3d"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.891385 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerStarted","Data":"cffc922f40211b84a3f404b465c14bd6f29dd9b43cda464b2bf7a4a71260daab"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.894319 4666 generic.go:334] "Generic (PLEG): container finished" podID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerID="9362d9eb8e6505523ff31ef63d3f467e0cbdf74f3fb4da0ad4209e11c7f41a48" exitCode=0 Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.894389 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerDied","Data":"9362d9eb8e6505523ff31ef63d3f467e0cbdf74f3fb4da0ad4209e11c7f41a48"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.894491 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerStarted","Data":"a213228e6968b1b985444f4c5a4927bf3a149a7df528729c396ccee4f1271b4a"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.897416 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fsqg" event={"ID":"8229727b-4723-46f6-919d-1eb721caefd1","Type":"ContainerStarted","Data":"efe0ddc8b64cbb7d15af4fb3ff48655c0689ca8105413a4bedc045a120070572"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.900128 4666 generic.go:334] "Generic (PLEG): container finished" podID="82ff889b-3fab-481d-b9a0-36991fb87e8f" 
containerID="e8def0e485a044dfffb8b4ed351a00441704bf371c693032e4154c9404bcc043" exitCode=0 Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.900201 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwb2h" event={"ID":"82ff889b-3fab-481d-b9a0-36991fb87e8f","Type":"ContainerDied","Data":"e8def0e485a044dfffb8b4ed351a00441704bf371c693032e4154c9404bcc043"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.900272 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwb2h" event={"ID":"82ff889b-3fab-481d-b9a0-36991fb87e8f","Type":"ContainerStarted","Data":"acaabc65c9150d1db9adb709142d434c285d96c3c8bc77d86fa225de4a60ce6b"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.904946 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerStarted","Data":"71dfff68d624b7706c9b0d03f34c827c265a6ef90b44b42de99dc19f7c12bd8b"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.909634 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerStarted","Data":"8311e29477794f1755340535271c552d403ce65a9d8532930b041331dd155f87"} Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.957950 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rbj9s" podStartSLOduration=2.450536758 podStartE2EDuration="4.95792504s" podCreationTimestamp="2025-12-03 12:27:48 +0000 UTC" firstStartedPulling="2025-12-03 12:27:49.817179496 +0000 UTC m=+858.662140547" lastFinishedPulling="2025-12-03 12:27:52.324567778 +0000 UTC m=+861.169528829" observedRunningTime="2025-12-03 12:27:52.953557712 +0000 UTC m=+861.798518813" watchObservedRunningTime="2025-12-03 12:27:52.95792504 +0000 UTC m=+861.802886101" Dec 03 12:27:52 crc kubenswrapper[4666]: I1203 12:27:52.989576 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qjsp" podStartSLOduration=2.498639552 podStartE2EDuration="4.989551288s" podCreationTimestamp="2025-12-03 12:27:48 +0000 UTC" firstStartedPulling="2025-12-03 12:27:49.815257353 +0000 UTC m=+858.660218394" lastFinishedPulling="2025-12-03 12:27:52.306169089 +0000 UTC m=+861.151130130" observedRunningTime="2025-12-03 12:27:52.985404565 +0000 UTC m=+861.830365626" watchObservedRunningTime="2025-12-03 12:27:52.989551288 +0000 UTC m=+861.834512359" Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.880602 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lqvpl"] Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.881694 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqvpl" Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.894598 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqvpl"] Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.929650 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerStarted","Data":"cfe8dca8bfe8deb848519c36d10bc9dcb940acd7bb8bd19290bec85caa55af6d"} Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.931224 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerStarted","Data":"8af69852f3b6ce0b4d277d22f2b0fec62c40d6020c6d89179ce74c54c44e90e8"} Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.933587 4666 generic.go:334] "Generic (PLEG): container finished" podID="8229727b-4723-46f6-919d-1eb721caefd1" containerID="efe0ddc8b64cbb7d15af4fb3ff48655c0689ca8105413a4bedc045a120070572" exitCode=0 Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.933620 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fsqg" event={"ID":"8229727b-4723-46f6-919d-1eb721caefd1","Type":"ContainerDied","Data":"efe0ddc8b64cbb7d15af4fb3ff48655c0689ca8105413a4bedc045a120070572"} Dec 03 12:27:53 crc kubenswrapper[4666]: I1203 12:27:53.935718 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwb2h" event={"ID":"82ff889b-3fab-481d-b9a0-36991fb87e8f","Type":"ContainerStarted","Data":"7fcfd24cd07f71a51702d8c2c3d6c06cf195ea28d6cdf4b3deb1fc42dcbfbf34"} Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.014510 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-utilities\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl" Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.015009 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nr7h\" (UniqueName: \"kubernetes.io/projected/74eb5845-66cb-4c54-a2c7-53f59a686e0d-kube-api-access-5nr7h\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl" Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.015147 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-catalog-content\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl" Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.081387 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5v6kg"] Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.082747 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.101498 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v6kg"]
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.116672 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nr7h\" (UniqueName: \"kubernetes.io/projected/74eb5845-66cb-4c54-a2c7-53f59a686e0d-kube-api-access-5nr7h\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.116762 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-catalog-content\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.116797 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-utilities\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.117273 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-utilities\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.117303 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74eb5845-66cb-4c54-a2c7-53f59a686e0d-catalog-content\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.137754 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nr7h\" (UniqueName: \"kubernetes.io/projected/74eb5845-66cb-4c54-a2c7-53f59a686e0d-kube-api-access-5nr7h\") pod \"certified-operators-lqvpl\" (UID: \"74eb5845-66cb-4c54-a2c7-53f59a686e0d\") " pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.218015 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-utilities\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.218071 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-catalog-content\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.218126 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfjp\" (UniqueName: \"kubernetes.io/projected/4e039eec-3ce6-475e-9e89-8dc64fd04701-kube-api-access-zvfjp\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.233833 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.321108 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-utilities\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.321175 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-catalog-content\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.321235 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfjp\" (UniqueName: \"kubernetes.io/projected/4e039eec-3ce6-475e-9e89-8dc64fd04701-kube-api-access-zvfjp\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.321738 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-utilities\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.322022 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e039eec-3ce6-475e-9e89-8dc64fd04701-catalog-content\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.342332 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvfjp\" (UniqueName: \"kubernetes.io/projected/4e039eec-3ce6-475e-9e89-8dc64fd04701-kube-api-access-zvfjp\") pod \"community-operators-5v6kg\" (UID: \"4e039eec-3ce6-475e-9e89-8dc64fd04701\") " pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.400649 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.606196 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v6kg"]
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.667721 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqvpl"]
Dec 03 12:27:54 crc kubenswrapper[4666]: W1203 12:27:54.677120 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74eb5845_66cb_4c54_a2c7_53f59a686e0d.slice/crio-d6b9f0b396975c7d68fcd18de342a020599af65e27dd26de502b09bdcdc00a0c WatchSource:0}: Error finding container d6b9f0b396975c7d68fcd18de342a020599af65e27dd26de502b09bdcdc00a0c: Status 404 returned error can't find the container with id d6b9f0b396975c7d68fcd18de342a020599af65e27dd26de502b09bdcdc00a0c
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.944074 4666 generic.go:334] "Generic (PLEG): container finished" podID="ece082ea-3119-43c4-9d36-152e62629542" containerID="cfe8dca8bfe8deb848519c36d10bc9dcb940acd7bb8bd19290bec85caa55af6d" exitCode=0
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.944131 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerDied","Data":"cfe8dca8bfe8deb848519c36d10bc9dcb940acd7bb8bd19290bec85caa55af6d"}
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.946352 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v6kg" event={"ID":"4e039eec-3ce6-475e-9e89-8dc64fd04701","Type":"ContainerStarted","Data":"0cddd2a2665442a47323c12aeb238480af5e6ae9fb3c5cdc4476ab4249c74e99"}
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.952539 4666 generic.go:334] "Generic (PLEG): container finished" podID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerID="8af69852f3b6ce0b4d277d22f2b0fec62c40d6020c6d89179ce74c54c44e90e8" exitCode=0
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.952589 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerDied","Data":"8af69852f3b6ce0b4d277d22f2b0fec62c40d6020c6d89179ce74c54c44e90e8"}
Dec 03 12:27:54 crc kubenswrapper[4666]: I1203 12:27:54.956252 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqvpl" event={"ID":"74eb5845-66cb-4c54-a2c7-53f59a686e0d","Type":"ContainerStarted","Data":"d6b9f0b396975c7d68fcd18de342a020599af65e27dd26de502b09bdcdc00a0c"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.967232 4666 generic.go:334] "Generic (PLEG): container finished" podID="74eb5845-66cb-4c54-a2c7-53f59a686e0d" containerID="5fca1835f70991d642a82ced518379a1c4d086f274cce3621ff42d20adc80ac9" exitCode=0
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.967340 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqvpl" event={"ID":"74eb5845-66cb-4c54-a2c7-53f59a686e0d","Type":"ContainerDied","Data":"5fca1835f70991d642a82ced518379a1c4d086f274cce3621ff42d20adc80ac9"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.972721 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerStarted","Data":"3f1c45ccb6439ac7d2d6a6db9c268832b5d78d0eed85adf6fb6bb04a668076d7"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.975302 4666 generic.go:334] "Generic (PLEG): container finished" podID="4e039eec-3ce6-475e-9e89-8dc64fd04701" containerID="e45773486c0d0fe0d885f6457f27639a4cfe798b3211458093c00e62cadc78a0" exitCode=0
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.975316 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v6kg" event={"ID":"4e039eec-3ce6-475e-9e89-8dc64fd04701","Type":"ContainerDied","Data":"e45773486c0d0fe0d885f6457f27639a4cfe798b3211458093c00e62cadc78a0"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.979452 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerStarted","Data":"11e2ac763f7abfd9e34ab4c969582be3c39d242e3ceedd1e464a96b4281cba9f"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.983493 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fsqg" event={"ID":"8229727b-4723-46f6-919d-1eb721caefd1","Type":"ContainerStarted","Data":"a9b42df2e41609d575ab527d5b9ef1e0a3dbe620ceb908a47afef589ea8e8bda"}
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.986162 4666 generic.go:334] "Generic (PLEG): container finished" podID="82ff889b-3fab-481d-b9a0-36991fb87e8f" containerID="7fcfd24cd07f71a51702d8c2c3d6c06cf195ea28d6cdf4b3deb1fc42dcbfbf34" exitCode=0
Dec 03 12:27:55 crc kubenswrapper[4666]: I1203 12:27:55.986206 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwb2h" event={"ID":"82ff889b-3fab-481d-b9a0-36991fb87e8f","Type":"ContainerDied","Data":"7fcfd24cd07f71a51702d8c2c3d6c06cf195ea28d6cdf4b3deb1fc42dcbfbf34"}
Dec 03 12:27:56 crc kubenswrapper[4666]: I1203 12:27:56.032068 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fsqg" podStartSLOduration=1.857751614 podStartE2EDuration="5.032046519s" podCreationTimestamp="2025-12-03 12:27:51 +0000 UTC" firstStartedPulling="2025-12-03 12:27:51.888726581 +0000 UTC m=+860.733687632" lastFinishedPulling="2025-12-03 12:27:55.063021486 +0000 UTC m=+863.907982537" observedRunningTime="2025-12-03 12:27:56.027963288 +0000 UTC m=+864.872924339" watchObservedRunningTime="2025-12-03 12:27:56.032046519 +0000 UTC m=+864.877007570"
Dec 03 12:27:56 crc kubenswrapper[4666]: I1203 12:27:56.060111 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r7r4t" podStartSLOduration=2.403689167 podStartE2EDuration="5.060066089s" podCreationTimestamp="2025-12-03 12:27:51 +0000 UTC" firstStartedPulling="2025-12-03 12:27:52.896400902 +0000 UTC m=+861.741361953" lastFinishedPulling="2025-12-03 12:27:55.552777824 +0000 UTC m=+864.397738875" observedRunningTime="2025-12-03 12:27:56.054675753 +0000 UTC m=+864.899636814" watchObservedRunningTime="2025-12-03 12:27:56.060066089 +0000 UTC m=+864.905027150"
Dec 03 12:27:56 crc kubenswrapper[4666]: I1203 12:27:56.097324 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7889c" podStartSLOduration=2.183410474 podStartE2EDuration="5.097288638s" podCreationTimestamp="2025-12-03 12:27:51 +0000 UTC" firstStartedPulling="2025-12-03 12:27:52.892705992 +0000 UTC m=+861.737667043" lastFinishedPulling="2025-12-03 12:27:55.806584156 +0000 UTC m=+864.651545207" observedRunningTime="2025-12-03 12:27:56.093446394 +0000 UTC m=+864.938407445" watchObservedRunningTime="2025-12-03 12:27:56.097288638 +0000 UTC m=+864.942249689"
Dec 03 12:27:56 crc kubenswrapper[4666]: I1203 12:27:56.996408 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwb2h" event={"ID":"82ff889b-3fab-481d-b9a0-36991fb87e8f","Type":"ContainerStarted","Data":"deaa3d650564053cb63876dfb05bb9dd538d9a08a36bd256fd5c2ae5832618c4"}
Dec 03 12:27:57 crc kubenswrapper[4666]: I1203 12:27:57.018288 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwb2h" podStartSLOduration=2.408334303 podStartE2EDuration="6.018268749s" podCreationTimestamp="2025-12-03 12:27:51 +0000 UTC" firstStartedPulling="2025-12-03 12:27:52.902017765 +0000 UTC m=+861.746978816" lastFinishedPulling="2025-12-03 12:27:56.511952191 +0000 UTC m=+865.356913262" observedRunningTime="2025-12-03 12:27:57.016021168 +0000 UTC m=+865.860982219" watchObservedRunningTime="2025-12-03 12:27:57.018268749 +0000 UTC m=+865.863229800"
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.006415 4666 generic.go:334] "Generic (PLEG): container finished" podID="74eb5845-66cb-4c54-a2c7-53f59a686e0d" containerID="5112a7f5a0aa2c5a8467ea1813a024a159ece66ded6c037c93934dadf026c59e" exitCode=0
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.006523 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqvpl" event={"ID":"74eb5845-66cb-4c54-a2c7-53f59a686e0d","Type":"ContainerDied","Data":"5112a7f5a0aa2c5a8467ea1813a024a159ece66ded6c037c93934dadf026c59e"}
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.010011 4666 generic.go:334] "Generic (PLEG): container finished" podID="4e039eec-3ce6-475e-9e89-8dc64fd04701" containerID="1647bf2109443c803cb7377e1e095d7420ce23bf46fb154628009ea21f4a6683" exitCode=0
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.010107 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v6kg" event={"ID":"4e039eec-3ce6-475e-9e89-8dc64fd04701","Type":"ContainerDied","Data":"1647bf2109443c803cb7377e1e095d7420ce23bf46fb154628009ea21f4a6683"}
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.893295 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.894474 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:58 crc kubenswrapper[4666]: I1203 12:27:58.940228 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:59 crc kubenswrapper[4666]: I1203 12:27:59.060334 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:27:59 crc kubenswrapper[4666]: I1203 12:27:59.237250 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:59 crc kubenswrapper[4666]: I1203 12:27:59.237311 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:27:59 crc kubenswrapper[4666]: I1203 12:27:59.282359 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:28:00 crc kubenswrapper[4666]: I1203 12:28:00.064724 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.032865 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqvpl" event={"ID":"74eb5845-66cb-4c54-a2c7-53f59a686e0d","Type":"ContainerStarted","Data":"8583b4559245c860715495057680a182b55540cfd8e162a8819407fe46fed733"}
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.036239 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v6kg" event={"ID":"4e039eec-3ce6-475e-9e89-8dc64fd04701","Type":"ContainerStarted","Data":"5606d157a938aa3137d1812cb2dae520e45a33364b10f35a01aa5e3b97ef8ddd"}
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.062044 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lqvpl" podStartSLOduration=3.586836092 podStartE2EDuration="8.062011708s" podCreationTimestamp="2025-12-03 12:27:53 +0000 UTC" firstStartedPulling="2025-12-03 12:27:55.968781264 +0000 UTC m=+864.813742315" lastFinishedPulling="2025-12-03 12:28:00.44395688 +0000 UTC m=+869.288917931" observedRunningTime="2025-12-03 12:28:01.055874801 +0000 UTC m=+869.900835862" watchObservedRunningTime="2025-12-03 12:28:01.062011708 +0000 UTC m=+869.906972759"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.080720 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5v6kg" podStartSLOduration=3.86143928 podStartE2EDuration="7.080701484s" podCreationTimestamp="2025-12-03 12:27:54 +0000 UTC" firstStartedPulling="2025-12-03 12:27:55.976500173 +0000 UTC m=+864.821461224" lastFinishedPulling="2025-12-03 12:27:59.195762377 +0000 UTC m=+868.040723428" observedRunningTime="2025-12-03 12:28:01.07796193 +0000 UTC m=+869.922922971" watchObservedRunningTime="2025-12-03 12:28:01.080701484 +0000 UTC m=+869.925662525"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.414400 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fsqg"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.414553 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fsqg"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.469439 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fsqg"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.670695 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwb2h"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.670799 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwb2h"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.816576 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7889c"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.816664 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7889c"
Dec 03 12:28:01 crc kubenswrapper[4666]: I1203 12:28:01.872888 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7889c"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.078693 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r7r4t"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.078822 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r7r4t"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.089602 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fsqg"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.092650 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7889c"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.128525 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r7r4t"
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.272455 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"]
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.273066 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qjsp" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="registry-server" containerID="cri-o://71dfff68d624b7706c9b0d03f34c827c265a6ef90b44b42de99dc19f7c12bd8b" gracePeriod=2
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.475678 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbj9s"]
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.476495 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rbj9s" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="registry-server" containerID="cri-o://8311e29477794f1755340535271c552d403ce65a9d8532930b041331dd155f87" gracePeriod=2
Dec 03 12:28:02 crc kubenswrapper[4666]: I1203 12:28:02.735572 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vwb2h" podUID="82ff889b-3fab-481d-b9a0-36991fb87e8f" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:28:02 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:28:02 crc kubenswrapper[4666]: >
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.050485 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerID="71dfff68d624b7706c9b0d03f34c827c265a6ef90b44b42de99dc19f7c12bd8b" exitCode=0
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.050892 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerDied","Data":"71dfff68d624b7706c9b0d03f34c827c265a6ef90b44b42de99dc19f7c12bd8b"}
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.103622 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r7r4t"
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.354608 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.454683 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl79l\" (UniqueName: \"kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l\") pod \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") "
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.454773 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities\") pod \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") "
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.455014 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content\") pod \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\" (UID: \"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847\") "
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.456208 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities" (OuterVolumeSpecName: "utilities") pod "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" (UID: "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.463174 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l" (OuterVolumeSpecName: "kube-api-access-wl79l") pod "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" (UID: "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847"). InnerVolumeSpecName "kube-api-access-wl79l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.507664 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" (UID: "5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.556485 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.556540 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:28:03 crc kubenswrapper[4666]: I1203 12:28:03.556554 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl79l\" (UniqueName: \"kubernetes.io/projected/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847-kube-api-access-wl79l\") on node \"crc\" DevicePath \"\""
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.059003 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qjsp" event={"ID":"5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847","Type":"ContainerDied","Data":"77316fe0f74aefed2c83fd63159f070b9dd04ab1a6ca440870d4fc67bf38b2c3"}
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.059075 4666 scope.go:117] "RemoveContainer" containerID="71dfff68d624b7706c9b0d03f34c827c265a6ef90b44b42de99dc19f7c12bd8b"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.059322 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qjsp"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.068992 4666 generic.go:334] "Generic (PLEG): container finished" podID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerID="8311e29477794f1755340535271c552d403ce65a9d8532930b041331dd155f87" exitCode=0
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.069779 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerDied","Data":"8311e29477794f1755340535271c552d403ce65a9d8532930b041331dd155f87"}
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.085953 4666 scope.go:117] "RemoveContainer" containerID="26d7033e9087ef8afc5f705ea17a0662246723093cb4c9e8b067567d6b6ba5ce"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.097448 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"]
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.108336 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qjsp"]
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.118419 4666 scope.go:117] "RemoveContainer" containerID="a08fb3d19095f4470080867022bad4f5d6779422d78bd84b5a70e0dd5a6aa012"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.234307 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.234515 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.274352 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.402241 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.402688 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.440309 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.694620 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"]
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.695025 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7889c" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="registry-server" containerID="cri-o://3f1c45ccb6439ac7d2d6a6db9c268832b5d78d0eed85adf6fb6bb04a668076d7" gracePeriod=2
Dec 03 12:28:04 crc kubenswrapper[4666]: I1203 12:28:04.879066 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"]
Dec 03 12:28:05 crc kubenswrapper[4666]: I1203 12:28:05.125679 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lqvpl"
Dec 03 12:28:05 crc kubenswrapper[4666]: I1203 12:28:05.126278 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5v6kg"
Dec 03 12:28:05 crc kubenswrapper[4666]: I1203 12:28:05.431911 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" path="/var/lib/kubelet/pods/5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847/volumes"
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.100837 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r7r4t" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="registry-server" containerID="cri-o://11e2ac763f7abfd9e34ab4c969582be3c39d242e3ceedd1e464a96b4281cba9f" gracePeriod=2
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.149988 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbj9s"
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.299972 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities\") pod \"46838e1f-61c7-4f5c-ad1f-240091d5df80\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") "
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.300210 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsg8h\" (UniqueName: \"kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h\") pod \"46838e1f-61c7-4f5c-ad1f-240091d5df80\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") "
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.300376 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content\") pod \"46838e1f-61c7-4f5c-ad1f-240091d5df80\" (UID: \"46838e1f-61c7-4f5c-ad1f-240091d5df80\") "
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.301591 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities" (OuterVolumeSpecName: "utilities") pod "46838e1f-61c7-4f5c-ad1f-240091d5df80" (UID: "46838e1f-61c7-4f5c-ad1f-240091d5df80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.308410 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h" (OuterVolumeSpecName: "kube-api-access-vsg8h") pod "46838e1f-61c7-4f5c-ad1f-240091d5df80" (UID: "46838e1f-61c7-4f5c-ad1f-240091d5df80"). InnerVolumeSpecName "kube-api-access-vsg8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.362178 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46838e1f-61c7-4f5c-ad1f-240091d5df80" (UID: "46838e1f-61c7-4f5c-ad1f-240091d5df80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.403173 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.403230 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46838e1f-61c7-4f5c-ad1f-240091d5df80-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:06 crc kubenswrapper[4666]: I1203 12:28:06.403291 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsg8h\" (UniqueName: \"kubernetes.io/projected/46838e1f-61c7-4f5c-ad1f-240091d5df80-kube-api-access-vsg8h\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.111173 4666 generic.go:334] "Generic (PLEG): container finished" podID="ece082ea-3119-43c4-9d36-152e62629542" containerID="3f1c45ccb6439ac7d2d6a6db9c268832b5d78d0eed85adf6fb6bb04a668076d7" exitCode=0 Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.111300 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerDied","Data":"3f1c45ccb6439ac7d2d6a6db9c268832b5d78d0eed85adf6fb6bb04a668076d7"} Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.115284 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbj9s" Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.115265 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbj9s" event={"ID":"46838e1f-61c7-4f5c-ad1f-240091d5df80","Type":"ContainerDied","Data":"31b6d2921df49e31b6a022c2149af1859349f15284542a4caa32ea27ea34ec93"} Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.115718 4666 scope.go:117] "RemoveContainer" containerID="8311e29477794f1755340535271c552d403ce65a9d8532930b041331dd155f87" Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.142534 4666 scope.go:117] "RemoveContainer" containerID="a07a8ea57064b9f59e72c358631ec4566f2da59fed83e908f0c375ca3781775a" Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.175201 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbj9s"] Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.176246 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rbj9s"] Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.185847 4666 scope.go:117] "RemoveContainer" containerID="593d1884d7db4c54b685b02928568029e3a91f88d3b5120d64259f326de58be9" Dec 03 12:28:07 crc kubenswrapper[4666]: I1203 12:28:07.434521 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" path="/var/lib/kubelet/pods/46838e1f-61c7-4f5c-ad1f-240091d5df80/volumes" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.136103 4666 generic.go:334] "Generic (PLEG): container finished" podID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerID="11e2ac763f7abfd9e34ab4c969582be3c39d242e3ceedd1e464a96b4281cba9f" exitCode=0 Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.136157 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" 
event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerDied","Data":"11e2ac763f7abfd9e34ab4c969582be3c39d242e3ceedd1e464a96b4281cba9f"} Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.692482 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.740864 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content\") pod \"ece082ea-3119-43c4-9d36-152e62629542\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.741221 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities\") pod \"ece082ea-3119-43c4-9d36-152e62629542\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.741403 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln46p\" (UniqueName: \"kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p\") pod \"ece082ea-3119-43c4-9d36-152e62629542\" (UID: \"ece082ea-3119-43c4-9d36-152e62629542\") " Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.741835 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities" (OuterVolumeSpecName: "utilities") pod "ece082ea-3119-43c4-9d36-152e62629542" (UID: "ece082ea-3119-43c4-9d36-152e62629542"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.745888 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p" (OuterVolumeSpecName: "kube-api-access-ln46p") pod "ece082ea-3119-43c4-9d36-152e62629542" (UID: "ece082ea-3119-43c4-9d36-152e62629542"). InnerVolumeSpecName "kube-api-access-ln46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.759794 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ece082ea-3119-43c4-9d36-152e62629542" (UID: "ece082ea-3119-43c4-9d36-152e62629542"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.843559 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.843905 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece082ea-3119-43c4-9d36-152e62629542-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:08 crc kubenswrapper[4666]: I1203 12:28:08.843974 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln46p\" (UniqueName: \"kubernetes.io/projected/ece082ea-3119-43c4-9d36-152e62629542-kube-api-access-ln46p\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.144039 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7889c" event={"ID":"ece082ea-3119-43c4-9d36-152e62629542","Type":"ContainerDied","Data":"cffc922f40211b84a3f404b465c14bd6f29dd9b43cda464b2bf7a4a71260daab"} Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.144150 4666 scope.go:117] "RemoveContainer" containerID="3f1c45ccb6439ac7d2d6a6db9c268832b5d78d0eed85adf6fb6bb04a668076d7" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.144315 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7889c" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.148555 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7r4t" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.153312 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7r4t" event={"ID":"c9d16e83-a84e-4114-845d-a4fdf158100e","Type":"ContainerDied","Data":"a213228e6968b1b985444f4c5a4927bf3a149a7df528729c396ccee4f1271b4a"} Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.169082 4666 scope.go:117] "RemoveContainer" containerID="cfe8dca8bfe8deb848519c36d10bc9dcb940acd7bb8bd19290bec85caa55af6d" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.219455 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"] Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.224797 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7889c"] Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.251881 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities\") pod \"c9d16e83-a84e-4114-845d-a4fdf158100e\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.251958 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5xd8\" (UniqueName: \"kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8\") pod \"c9d16e83-a84e-4114-845d-a4fdf158100e\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.252042 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content\") pod \"c9d16e83-a84e-4114-845d-a4fdf158100e\" (UID: \"c9d16e83-a84e-4114-845d-a4fdf158100e\") " Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.253200 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities" (OuterVolumeSpecName: "utilities") pod "c9d16e83-a84e-4114-845d-a4fdf158100e" (UID: "c9d16e83-a84e-4114-845d-a4fdf158100e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.258805 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8" (OuterVolumeSpecName: "kube-api-access-p5xd8") pod "c9d16e83-a84e-4114-845d-a4fdf158100e" (UID: "c9d16e83-a84e-4114-845d-a4fdf158100e"). InnerVolumeSpecName "kube-api-access-p5xd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.353902 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.354246 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5xd8\" (UniqueName: \"kubernetes.io/projected/c9d16e83-a84e-4114-845d-a4fdf158100e-kube-api-access-p5xd8\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.381340 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9d16e83-a84e-4114-845d-a4fdf158100e" (UID: "c9d16e83-a84e-4114-845d-a4fdf158100e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.431694 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece082ea-3119-43c4-9d36-152e62629542" path="/var/lib/kubelet/pods/ece082ea-3119-43c4-9d36-152e62629542/volumes" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.455889 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d16e83-a84e-4114-845d-a4fdf158100e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.467399 4666 scope.go:117] "RemoveContainer" containerID="00ffecfca5d56b31140c0fe4a1f00c6ff0a26f084945783f97549665617eec3d" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.492075 4666 scope.go:117] "RemoveContainer" containerID="11e2ac763f7abfd9e34ab4c969582be3c39d242e3ceedd1e464a96b4281cba9f" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.509400 4666 scope.go:117] "RemoveContainer" containerID="8af69852f3b6ce0b4d277d22f2b0fec62c40d6020c6d89179ce74c54c44e90e8" Dec 03 12:28:09 crc kubenswrapper[4666]: I1203 12:28:09.529511 4666 scope.go:117] "RemoveContainer" containerID="9362d9eb8e6505523ff31ef63d3f467e0cbdf74f3fb4da0ad4209e11c7f41a48" Dec 03 12:28:10 crc kubenswrapper[4666]: I1203 12:28:10.181693 4666 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 12:28:10 crc kubenswrapper[4666]: I1203 12:28:10.210102 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"]
Dec 03 12:28:10 crc kubenswrapper[4666]: I1203 12:28:10.215520 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r7r4t"]
Dec 03 12:28:11 crc kubenswrapper[4666]: I1203 12:28:11.442205 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" path="/var/lib/kubelet/pods/c9d16e83-a84e-4114-845d-a4fdf158100e/volumes"
Dec 03 12:28:11 crc kubenswrapper[4666]: I1203 12:28:11.712477 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwb2h"
Dec 03 12:28:11 crc kubenswrapper[4666]: I1203 12:28:11.781718 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwb2h"
Dec 03 12:29:09 crc kubenswrapper[4666]: I1203 12:29:09.602846 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"]
Dec 03 12:29:09 crc kubenswrapper[4666]: I1203 12:29:09.603890 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" podUID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" containerName="controller-manager" containerID="cri-o://07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c" gracePeriod=30
Dec 03 12:29:09 crc kubenswrapper[4666]: I1203 12:29:09.704041 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"]
Dec 03 12:29:09 crc kubenswrapper[4666]: I1203 12:29:09.704398 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" podUID="7583101e-f814-41d1-9b78-086c48e16385" containerName="route-controller-manager" containerID="cri-o://5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627" gracePeriod=30
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.025050 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm"
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.082828 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.202863 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca\") pod \"7583101e-f814-41d1-9b78-086c48e16385\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.202928 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxmk\" (UniqueName: \"kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk\") pod \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.202973 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config\") pod \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.202999 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles\") pod \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.203042 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert\") pod \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.203070 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert\") pod \"7583101e-f814-41d1-9b78-086c48e16385\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.203114 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmrpz\" (UniqueName: \"kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz\") pod \"7583101e-f814-41d1-9b78-086c48e16385\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.203154 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca\") pod \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\" (UID: \"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.203201 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config\") pod \"7583101e-f814-41d1-9b78-086c48e16385\" (UID: \"7583101e-f814-41d1-9b78-086c48e16385\") "
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.204436 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config" (OuterVolumeSpecName: "config") pod "7583101e-f814-41d1-9b78-086c48e16385" (UID: "7583101e-f814-41d1-9b78-086c48e16385"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.204685 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583101e-f814-41d1-9b78-086c48e16385" (UID: "7583101e-f814-41d1-9b78-086c48e16385"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.205866 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" (UID: "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.206025 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" (UID: "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.206055 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config" (OuterVolumeSpecName: "config") pod "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" (UID: "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.211240 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" (UID: "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.211460 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk" (OuterVolumeSpecName: "kube-api-access-vwxmk") pod "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" (UID: "ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc"). InnerVolumeSpecName "kube-api-access-vwxmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.211707 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583101e-f814-41d1-9b78-086c48e16385" (UID: "7583101e-f814-41d1-9b78-086c48e16385"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.211744 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz" (OuterVolumeSpecName: "kube-api-access-vmrpz") pod "7583101e-f814-41d1-9b78-086c48e16385" (UID: "7583101e-f814-41d1-9b78-086c48e16385"). InnerVolumeSpecName "kube-api-access-vmrpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304544 4666 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304591 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwxmk\" (UniqueName: \"kubernetes.io/projected/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-kube-api-access-vwxmk\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304605 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304614 4666 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304622 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304641 4666 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583101e-f814-41d1-9b78-086c48e16385-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304650 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmrpz\" (UniqueName: \"kubernetes.io/projected/7583101e-f814-41d1-9b78-086c48e16385-kube-api-access-vmrpz\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304660 4666 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.304668 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583101e-f814-41d1-9b78-086c48e16385-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.549238 4666 generic.go:334] "Generic (PLEG): container finished" podID="7583101e-f814-41d1-9b78-086c48e16385" containerID="5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627" exitCode=0
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.549307 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" event={"ID":"7583101e-f814-41d1-9b78-086c48e16385","Type":"ContainerDied","Data":"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627"}
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.549382 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" event={"ID":"7583101e-f814-41d1-9b78-086c48e16385","Type":"ContainerDied","Data":"f893e7755991ae31c02a8e139f24f5295ef678fbc02109da5eca9af8791eaa1d"}
Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.549406 4666 scope.go:117] "RemoveContainer"
containerID="5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.549340 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.552578 4666 generic.go:334] "Generic (PLEG): container finished" podID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" containerID="07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c" exitCode=0 Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.552656 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.552645 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" event={"ID":"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc","Type":"ContainerDied","Data":"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c"} Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.552931 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv5pm" event={"ID":"ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc","Type":"ContainerDied","Data":"9eeaa989eb5c6ae64cb7f0f19578bd8a048aabfa330264835fdec194399cd4a4"} Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.572197 4666 scope.go:117] "RemoveContainer" containerID="5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627" Dec 03 12:29:10 crc kubenswrapper[4666]: E1203 12:29:10.573329 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627\": container with ID starting with 5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627 not found: ID does not exist" containerID="5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.573482 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627"} err="failed to get container status \"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627\": rpc error: code = NotFound desc = could not find container \"5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627\": container with ID starting with 5f35366257de53b6b271e05fc2977bbf26e0d414822bf01e9b2c071afeb34627 not found: ID does not exist" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.573642 4666 scope.go:117] "RemoveContainer" containerID="07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.593689 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"] Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.596664 4666 scope.go:117] "RemoveContainer" containerID="07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.596945 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vdcsd"] Dec 03 12:29:10 crc kubenswrapper[4666]: E1203 12:29:10.597232 4666 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c\": container with ID starting with 07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c not found: ID does not exist" containerID="07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.597275 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c"} err="failed to get container status \"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c\": rpc error: code = NotFound desc = could not find container \"07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c\": container with ID starting with 07fed01e70fce3495ea3625f2416a50bcf200905018e2d5eebfc168ad626869c not found: ID does not exist" Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.608998 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"] Dec 03 12:29:10 crc kubenswrapper[4666]: I1203 12:29:10.612888 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv5pm"] Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.316395 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58499495cf-gspr9"] Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317286 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7583101e-f814-41d1-9b78-086c48e16385" containerName="route-controller-manager" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317304 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7583101e-f814-41d1-9b78-086c48e16385" containerName="route-controller-manager" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317316 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317324 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317340 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317350 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317361 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317368 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317381 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" containerName="controller-manager" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317388 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" containerName="controller-manager" Dec 03 12:29:11 crc 
kubenswrapper[4666]: E1203 12:29:11.317397 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317404 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317419 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317426 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317439 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317447 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317459 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317466 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317476 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317484 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317497 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317504 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317511 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317518 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317525 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317532 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="extract-utilities" Dec 03 12:29:11 crc kubenswrapper[4666]: E1203 12:29:11.317542 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317548 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" 
containerName="extract-content" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317663 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" containerName="controller-manager" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317676 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="46838e1f-61c7-4f5c-ad1f-240091d5df80" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317688 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d16e83-a84e-4114-845d-a4fdf158100e" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317699 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7583101e-f814-41d1-9b78-086c48e16385" containerName="route-controller-manager" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317710 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c14f5eb-2fd9-4ecb-8e0e-7e1d8fddf847" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.317720 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece082ea-3119-43c4-9d36-152e62629542" containerName="registry-server" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.318351 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.321774 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.321875 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.321774 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.322077 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.322279 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.322891 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.325171 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66"] Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.328322 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.334794 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.334809 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.335149 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.335195 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.335297 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.335392 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.335529 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.336012 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66"] Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.343628 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58499495cf-gspr9"] Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420228 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-serving-cert\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420307 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-config\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420335 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-proxy-ca-bundles\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420368 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-config\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " 
pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420392 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dzt9\" (UniqueName: \"kubernetes.io/projected/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-kube-api-access-8dzt9\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420552 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a27f74-e678-4a44-9c9c-89b73440e25a-serving-cert\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420655 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-client-ca\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420694 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-client-ca\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.420712 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s228k\" (UniqueName: \"kubernetes.io/projected/69a27f74-e678-4a44-9c9c-89b73440e25a-kube-api-access-s228k\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.430799 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583101e-f814-41d1-9b78-086c48e16385" path="/var/lib/kubelet/pods/7583101e-f814-41d1-9b78-086c48e16385/volumes" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.431548 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc" path="/var/lib/kubelet/pods/ebe4dc75-d4fe-4fb4-9df5-f4def2b1e1dc/volumes" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522497 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-client-ca\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522558 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-client-ca\") pod \"controller-manager-58499495cf-gspr9\" (UID: 
\"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522583 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s228k\" (UniqueName: \"kubernetes.io/projected/69a27f74-e678-4a44-9c9c-89b73440e25a-kube-api-access-s228k\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522612 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-serving-cert\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522651 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-config\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522670 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-proxy-ca-bundles\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522695 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-config\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522716 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dzt9\" (UniqueName: \"kubernetes.io/projected/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-kube-api-access-8dzt9\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.522752 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a27f74-e678-4a44-9c9c-89b73440e25a-serving-cert\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.524000 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-client-ca\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc 
kubenswrapper[4666]: I1203 12:29:11.524001 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-client-ca\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.524913 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-config\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.525079 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69a27f74-e678-4a44-9c9c-89b73440e25a-proxy-ca-bundles\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.525392 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-config\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.528709 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69a27f74-e678-4a44-9c9c-89b73440e25a-serving-cert\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.528706 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-serving-cert\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.547730 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dzt9\" (UniqueName: \"kubernetes.io/projected/49f0fe53-e4b3-4c56-b0a0-06988f3e6f11-kube-api-access-8dzt9\") pod \"route-controller-manager-d5584c5db-cwx66\" (UID: \"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11\") " pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.548009 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s228k\" (UniqueName: \"kubernetes.io/projected/69a27f74-e678-4a44-9c9c-89b73440e25a-kube-api-access-s228k\") pod \"controller-manager-58499495cf-gspr9\" (UID: \"69a27f74-e678-4a44-9c9c-89b73440e25a\") " pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.648024 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.658684 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.949713 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58499495cf-gspr9"] Dec 03 12:29:11 crc kubenswrapper[4666]: I1203 12:29:11.983394 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66"] Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.566814 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" event={"ID":"69a27f74-e678-4a44-9c9c-89b73440e25a","Type":"ContainerStarted","Data":"910a6e21b895c72ea1617d5342cc6f210241424bb9031fec84cfe47af09041fb"} Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.566881 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" event={"ID":"69a27f74-e678-4a44-9c9c-89b73440e25a","Type":"ContainerStarted","Data":"91816a9430e8359f32aed586fa8d4c577c5e37002246e8dcbc3fbe39f218ba52"} Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.567289 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.569979 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" event={"ID":"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11","Type":"ContainerStarted","Data":"1d84ec67278f6a9c7e7fd22d84b00d15d824e9a9f83501be845252c145a824d4"} Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.570024 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" event={"ID":"49f0fe53-e4b3-4c56-b0a0-06988f3e6f11","Type":"ContainerStarted","Data":"94e0be5ca07da86220983f58069f2993d4b93347d8bfd6bb2438a9d0468e63fc"} Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.570219 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.573715 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.576168 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" Dec 03 12:29:12 crc kubenswrapper[4666]: I1203 12:29:12.623637 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58499495cf-gspr9" podStartSLOduration=3.623611091 podStartE2EDuration="3.623611091s" podCreationTimestamp="2025-12-03 12:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:29:12.600749271 +0000 UTC m=+941.445710322" watchObservedRunningTime="2025-12-03 12:29:12.623611091 +0000 UTC m=+941.468572142" Dec 03 12:29:12 crc 
kubenswrapper[4666]: I1203 12:29:12.623790 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d5584c5db-cwx66" podStartSLOduration=3.623784236 podStartE2EDuration="3.623784236s" podCreationTimestamp="2025-12-03 12:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:29:12.620503627 +0000 UTC m=+941.465464688" watchObservedRunningTime="2025-12-03 12:29:12.623784236 +0000 UTC m=+941.468745297" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.178126 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt"] Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.181272 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.184486 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.184760 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.197853 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt"] Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.339454 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.339541 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hnn\" (UniqueName: \"kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.339660 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.440639 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.440819 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hnn\" (UniqueName: 
\"kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.440873 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.442995 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.449500 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.462327 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hnn\" (UniqueName: \"kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn\") pod \"collect-profiles-29412750-ds6lt\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.499742 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:00 crc kubenswrapper[4666]: I1203 12:30:00.915973 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt"] Dec 03 12:30:01 crc kubenswrapper[4666]: I1203 12:30:01.901432 4666 generic.go:334] "Generic (PLEG): container finished" podID="274f78bd-4167-45f6-9b52-bbb47d6ce388" containerID="f023f849ad36ddfea7f6a9cb8aef15f795eb86afe4becf441c67a153102fbd1a" exitCode=0 Dec 03 12:30:01 crc kubenswrapper[4666]: I1203 12:30:01.901509 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" event={"ID":"274f78bd-4167-45f6-9b52-bbb47d6ce388","Type":"ContainerDied","Data":"f023f849ad36ddfea7f6a9cb8aef15f795eb86afe4becf441c67a153102fbd1a"} Dec 03 12:30:01 crc kubenswrapper[4666]: I1203 12:30:01.901939 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" event={"ID":"274f78bd-4167-45f6-9b52-bbb47d6ce388","Type":"ContainerStarted","Data":"5ca59a54daeedc55361925d4a9bd9547f503d2d9885993f45eb868084812a8c4"} Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.197296 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.297891 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume\") pod \"274f78bd-4167-45f6-9b52-bbb47d6ce388\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.297981 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hnn\" (UniqueName: \"kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn\") pod \"274f78bd-4167-45f6-9b52-bbb47d6ce388\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.298112 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume\") pod \"274f78bd-4167-45f6-9b52-bbb47d6ce388\" (UID: \"274f78bd-4167-45f6-9b52-bbb47d6ce388\") " Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.298755 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume" (OuterVolumeSpecName: "config-volume") pod "274f78bd-4167-45f6-9b52-bbb47d6ce388" (UID: "274f78bd-4167-45f6-9b52-bbb47d6ce388"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.305348 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "274f78bd-4167-45f6-9b52-bbb47d6ce388" (UID: "274f78bd-4167-45f6-9b52-bbb47d6ce388"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.305764 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn" (OuterVolumeSpecName: "kube-api-access-r9hnn") pod "274f78bd-4167-45f6-9b52-bbb47d6ce388" (UID: "274f78bd-4167-45f6-9b52-bbb47d6ce388"). InnerVolumeSpecName "kube-api-access-r9hnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.399995 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f78bd-4167-45f6-9b52-bbb47d6ce388-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.400038 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hnn\" (UniqueName: \"kubernetes.io/projected/274f78bd-4167-45f6-9b52-bbb47d6ce388-kube-api-access-r9hnn\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.400052 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f78bd-4167-45f6-9b52-bbb47d6ce388-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.916908 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" event={"ID":"274f78bd-4167-45f6-9b52-bbb47d6ce388","Type":"ContainerDied","Data":"5ca59a54daeedc55361925d4a9bd9547f503d2d9885993f45eb868084812a8c4"} Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.916969 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca59a54daeedc55361925d4a9bd9547f503d2d9885993f45eb868084812a8c4" Dec 03 12:30:03 crc kubenswrapper[4666]: I1203 12:30:03.917004 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt" Dec 03 12:30:09 crc kubenswrapper[4666]: I1203 12:30:09.867016 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:30:09 crc kubenswrapper[4666]: I1203 12:30:09.867644 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:30:39 crc kubenswrapper[4666]: I1203 12:30:39.866984 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:30:39 crc kubenswrapper[4666]: I1203 12:30:39.867727 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:31:09 crc kubenswrapper[4666]: I1203 12:31:09.866564 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:31:09 crc kubenswrapper[4666]: I1203 
12:31:09.867567 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:31:09 crc kubenswrapper[4666]: I1203 12:31:09.867652 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:31:09 crc kubenswrapper[4666]: I1203 12:31:09.868778 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:31:09 crc kubenswrapper[4666]: I1203 12:31:09.868949 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522" gracePeriod=600 Dec 03 12:31:11 crc kubenswrapper[4666]: I1203 12:31:11.375028 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522" exitCode=0 Dec 03 12:31:11 crc kubenswrapper[4666]: I1203 12:31:11.375122 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522"} Dec 03 12:31:11 crc kubenswrapper[4666]: I1203 12:31:11.375487 4666 scope.go:117] "RemoveContainer" containerID="fa81dab43a4d71f3d67c6c619619b55815978bcd28afe3f39fd74796b1fa5fe6" Dec 03 12:31:12 crc kubenswrapper[4666]: I1203 12:31:12.383664 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a"} Dec 03 12:33:39 crc kubenswrapper[4666]: I1203 12:33:39.866425 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:33:39 crc kubenswrapper[4666]: I1203 12:33:39.868580 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:09 crc kubenswrapper[4666]: I1203 12:34:09.866727 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 12:34:09 crc kubenswrapper[4666]: I1203 12:34:09.867663 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:39 crc kubenswrapper[4666]: I1203 12:34:39.866468 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:34:39 crc kubenswrapper[4666]: I1203 12:34:39.868747 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:34:39 crc kubenswrapper[4666]: I1203 12:34:39.868835 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:34:39 crc kubenswrapper[4666]: I1203 12:34:39.869690 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:34:39 crc kubenswrapper[4666]: I1203 12:34:39.869771 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a" gracePeriod=600 Dec 03 12:34:40 crc kubenswrapper[4666]: I1203 12:34:40.665936 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a" exitCode=0 Dec 03 12:34:40 crc kubenswrapper[4666]: I1203 12:34:40.666009 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a"} Dec 03 12:34:40 crc kubenswrapper[4666]: I1203 12:34:40.666069 4666 scope.go:117] "RemoveContainer" containerID="42bacd4c418c06abe4ddafebca16fbfc263995c2be08b6db1014fc4108c63522" Dec 03 12:34:42 crc kubenswrapper[4666]: I1203 12:34:42.688902 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb"} Dec 03 12:37:09 crc kubenswrapper[4666]: I1203 12:37:09.866607 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:37:09 crc kubenswrapper[4666]: I1203 12:37:09.867279 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:37:39 crc kubenswrapper[4666]: I1203 12:37:39.866694 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:37:39 crc kubenswrapper[4666]: I1203 12:37:39.868389 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.155678 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:00 crc kubenswrapper[4666]: E1203 12:38:00.156760 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274f78bd-4167-45f6-9b52-bbb47d6ce388" containerName="collect-profiles" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.156780 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="274f78bd-4167-45f6-9b52-bbb47d6ce388" containerName="collect-profiles" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.156914 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="274f78bd-4167-45f6-9b52-bbb47d6ce388" containerName="collect-profiles" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.157949 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.171820 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.342451 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.342515 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.342592 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnznr\" (UniqueName: \"kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.443806 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnznr\" (UniqueName: \"kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.443879 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.443905 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.444422 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.444871 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.467867 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lnznr\" (UniqueName: \"kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr\") pod \"certified-operators-w6p79\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.488694 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.757662 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:00 crc kubenswrapper[4666]: I1203 12:38:00.909403 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerStarted","Data":"dc190fd6d88c02afa148bb0e885c72c24e19efcd1a59d572db63b29d317eab00"} Dec 03 12:38:01 crc kubenswrapper[4666]: I1203 12:38:01.918460 4666 generic.go:334] "Generic (PLEG): container finished" podID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerID="5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e" exitCode=0 Dec 03 12:38:01 crc kubenswrapper[4666]: I1203 12:38:01.918551 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerDied","Data":"5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e"} Dec 03 12:38:01 crc kubenswrapper[4666]: I1203 12:38:01.922234 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:38:02 crc kubenswrapper[4666]: I1203 12:38:02.929124 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerStarted","Data":"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db"} Dec 03 12:38:03 crc kubenswrapper[4666]: I1203 12:38:03.950577 4666 generic.go:334] "Generic (PLEG): container finished" podID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerID="15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db" exitCode=0 Dec 03 12:38:03 crc kubenswrapper[4666]: I1203 12:38:03.950722 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerDied","Data":"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db"} Dec 03 12:38:04 crc kubenswrapper[4666]: I1203 12:38:04.961900 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerStarted","Data":"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4"} Dec 03 12:38:04 crc kubenswrapper[4666]: I1203 12:38:04.985114 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w6p79" podStartSLOduration=2.238200913 podStartE2EDuration="4.985079498s" podCreationTimestamp="2025-12-03 12:38:00 +0000 UTC" firstStartedPulling="2025-12-03 12:38:01.921923464 +0000 UTC m=+1470.766884515" lastFinishedPulling="2025-12-03 12:38:04.668802049 +0000 UTC m=+1473.513763100" observedRunningTime="2025-12-03 12:38:04.983027562 +0000 UTC m=+1473.827988633" watchObservedRunningTime="2025-12-03 
12:38:04.985079498 +0000 UTC m=+1473.830040549" Dec 03 12:38:09 crc kubenswrapper[4666]: I1203 12:38:09.866764 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:38:09 crc kubenswrapper[4666]: I1203 12:38:09.868144 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:38:09 crc kubenswrapper[4666]: I1203 12:38:09.868296 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:38:09 crc kubenswrapper[4666]: I1203 12:38:09.869055 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:38:09 crc kubenswrapper[4666]: I1203 12:38:09.869225 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb" gracePeriod=600 Dec 03 12:38:10 crc kubenswrapper[4666]: I1203 12:38:10.489136 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:10 crc kubenswrapper[4666]: I1203 12:38:10.489636 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:10 crc kubenswrapper[4666]: I1203 12:38:10.538649 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:10.999786 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb" exitCode=0 Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:10.999872 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb"} Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.000301 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"} Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.000329 4666 scope.go:117] "RemoveContainer" containerID="799f5558a2cf8c6592a511d64c8330b1e87c4e41d1366a8f10cad7a6b0d9cd8a" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 
12:38:11.052167 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.144135 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.624631 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g7hwp"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.626048 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.628643 4666 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-l87p5" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.628892 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.629660 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.635501 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q94td"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.637063 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-q94td" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.640533 4666 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-5xlpv" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.643667 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g7hwp"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.657832 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q94td"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.660957 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nf766"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.661726 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.686797 4666 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7f5w2" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.688667 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nf766"] Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.807739 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bft4k\" (UniqueName: \"kubernetes.io/projected/b7fc3a3d-9867-4055-b071-c43574b66e7a-kube-api-access-bft4k\") pod \"cert-manager-webhook-5655c58dd6-nf766\" (UID: \"b7fc3a3d-9867-4055-b071-c43574b66e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.807816 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r442r\" (UniqueName: \"kubernetes.io/projected/100d1193-8c3e-442e-8e9c-9983b5292555-kube-api-access-r442r\") pod \"cert-manager-5b446d88c5-q94td\" (UID: \"100d1193-8c3e-442e-8e9c-9983b5292555\") " pod="cert-manager/cert-manager-5b446d88c5-q94td" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.808193 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgpz\" (UniqueName: \"kubernetes.io/projected/9dafb972-28e7-4392-9d8f-0d6036c5adab-kube-api-access-mvgpz\") pod \"cert-manager-cainjector-7f985d654d-g7hwp\" (UID: \"9dafb972-28e7-4392-9d8f-0d6036c5adab\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.909648 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgpz\" (UniqueName: \"kubernetes.io/projected/9dafb972-28e7-4392-9d8f-0d6036c5adab-kube-api-access-mvgpz\") pod \"cert-manager-cainjector-7f985d654d-g7hwp\" (UID: \"9dafb972-28e7-4392-9d8f-0d6036c5adab\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.909721 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bft4k\" (UniqueName: \"kubernetes.io/projected/b7fc3a3d-9867-4055-b071-c43574b66e7a-kube-api-access-bft4k\") pod \"cert-manager-webhook-5655c58dd6-nf766\" (UID: \"b7fc3a3d-9867-4055-b071-c43574b66e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.909754 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r442r\" (UniqueName: \"kubernetes.io/projected/100d1193-8c3e-442e-8e9c-9983b5292555-kube-api-access-r442r\") pod \"cert-manager-5b446d88c5-q94td\" (UID: \"100d1193-8c3e-442e-8e9c-9983b5292555\") " pod="cert-manager/cert-manager-5b446d88c5-q94td" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.940378 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bft4k\" (UniqueName: \"kubernetes.io/projected/b7fc3a3d-9867-4055-b071-c43574b66e7a-kube-api-access-bft4k\") pod \"cert-manager-webhook-5655c58dd6-nf766\" (UID: \"b7fc3a3d-9867-4055-b071-c43574b66e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.941034 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r442r\" (UniqueName: \"kubernetes.io/projected/100d1193-8c3e-442e-8e9c-9983b5292555-kube-api-access-r442r\") pod \"cert-manager-5b446d88c5-q94td\" (UID: \"100d1193-8c3e-442e-8e9c-9983b5292555\") " pod="cert-manager/cert-manager-5b446d88c5-q94td" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.949183 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgpz\" (UniqueName: \"kubernetes.io/projected/9dafb972-28e7-4392-9d8f-0d6036c5adab-kube-api-access-mvgpz\") pod \"cert-manager-cainjector-7f985d654d-g7hwp\" (UID: \"9dafb972-28e7-4392-9d8f-0d6036c5adab\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.968467 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-q94td" Dec 03 12:38:11 crc kubenswrapper[4666]: I1203 12:38:11.981046 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:12 crc kubenswrapper[4666]: I1203 12:38:12.223416 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-q94td"] Dec 03 12:38:12 crc kubenswrapper[4666]: W1203 12:38:12.234838 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod100d1193_8c3e_442e_8e9c_9983b5292555.slice/crio-080d9d5c204d0cec2ed2d3030562cb92c53438d5d5f2bcfe02f2bdbafda1a44a WatchSource:0}: Error finding container 080d9d5c204d0cec2ed2d3030562cb92c53438d5d5f2bcfe02f2bdbafda1a44a: Status 404 returned error can't find the container with id 080d9d5c204d0cec2ed2d3030562cb92c53438d5d5f2bcfe02f2bdbafda1a44a Dec 03 12:38:12 crc kubenswrapper[4666]: I1203 12:38:12.244188 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" Dec 03 12:38:12 crc kubenswrapper[4666]: I1203 12:38:12.582172 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nf766"] Dec 03 12:38:12 crc kubenswrapper[4666]: W1203 12:38:12.598701 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7fc3a3d_9867_4055_b071_c43574b66e7a.slice/crio-282316ac1ed264d2e61af7ac8025431d80f5cc706983f3aacad71cfbf6f0c230 WatchSource:0}: Error finding container 282316ac1ed264d2e61af7ac8025431d80f5cc706983f3aacad71cfbf6f0c230: Status 404 returned error can't find the container with id 282316ac1ed264d2e61af7ac8025431d80f5cc706983f3aacad71cfbf6f0c230 Dec 03 12:38:12 crc kubenswrapper[4666]: I1203 12:38:12.685205 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g7hwp"] Dec 03 12:38:12 crc kubenswrapper[4666]: W1203 12:38:12.695629 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dafb972_28e7_4392_9d8f_0d6036c5adab.slice/crio-a08196313a282973097f45ac97c4801d9cf8dc3c92c159f588be62b5d0152875 WatchSource:0}: Error finding container a08196313a282973097f45ac97c4801d9cf8dc3c92c159f588be62b5d0152875: Status 404 returned error can't find the container with id a08196313a282973097f45ac97c4801d9cf8dc3c92c159f588be62b5d0152875 Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.019116 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-q94td" event={"ID":"100d1193-8c3e-442e-8e9c-9983b5292555","Type":"ContainerStarted","Data":"080d9d5c204d0cec2ed2d3030562cb92c53438d5d5f2bcfe02f2bdbafda1a44a"} Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.020415 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" event={"ID":"b7fc3a3d-9867-4055-b071-c43574b66e7a","Type":"ContainerStarted","Data":"282316ac1ed264d2e61af7ac8025431d80f5cc706983f3aacad71cfbf6f0c230"} Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.021289 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" event={"ID":"9dafb972-28e7-4392-9d8f-0d6036c5adab","Type":"ContainerStarted","Data":"a08196313a282973097f45ac97c4801d9cf8dc3c92c159f588be62b5d0152875"} Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.021806 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w6p79" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="registry-server" containerID="cri-o://9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4" gracePeriod=2 Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.189907 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.191689 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.204277 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.335308 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.335908 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mwp\" (UniqueName: \"kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.335944 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.437988 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mwp\" (UniqueName: \"kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.438068 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.438215 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.438837 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.439516 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.463391 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b4mwp\" (UniqueName: \"kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp\") pod \"community-operators-r6ft8\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.510936 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:13 crc kubenswrapper[4666]: I1203 12:38:13.972697 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.032667 4666 generic.go:334] "Generic (PLEG): container finished" podID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerID="9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4" exitCode=0 Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.032721 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerDied","Data":"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4"} Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.032755 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6p79" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.032798 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6p79" event={"ID":"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a","Type":"ContainerDied","Data":"dc190fd6d88c02afa148bb0e885c72c24e19efcd1a59d572db63b29d317eab00"} Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.032822 4666 scope.go:117] "RemoveContainer" containerID="9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.091570 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.146775 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnznr\" (UniqueName: \"kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr\") pod \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.146834 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities\") pod \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.146869 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content\") pod \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\" (UID: \"f9f05e87-d7ac-4f1f-816c-c0be06b66d7a\") " Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.148281 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities" (OuterVolumeSpecName: "utilities") pod "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" (UID: "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.151346 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr" (OuterVolumeSpecName: "kube-api-access-lnznr") pod "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" (UID: "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a"). InnerVolumeSpecName "kube-api-access-lnznr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:38:14 crc kubenswrapper[4666]: W1203 12:38:14.194007 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37520ad_5a38_43b8_ba00_071435281aa3.slice/crio-0c92321415f77c76a0826f3cbbdedde33dcbb7615036bb7ed4742f79625a70bd WatchSource:0}: Error finding container 0c92321415f77c76a0826f3cbbdedde33dcbb7615036bb7ed4742f79625a70bd: Status 404 returned error can't find the container with id 0c92321415f77c76a0826f3cbbdedde33dcbb7615036bb7ed4742f79625a70bd Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.209763 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" (UID: "f9f05e87-d7ac-4f1f-816c-c0be06b66d7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.210577 4666 scope.go:117] "RemoveContainer" containerID="15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.234197 4666 scope.go:117] "RemoveContainer" containerID="5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.249061 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnznr\" (UniqueName: \"kubernetes.io/projected/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-kube-api-access-lnznr\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.249134 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.249150 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.250200 4666 scope.go:117] "RemoveContainer" containerID="9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4" Dec 03 12:38:14 crc kubenswrapper[4666]: E1203 12:38:14.252575 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4\": container with ID starting with 9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4 not found: ID does not exist" containerID="9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.252612 4666 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4"} err="failed to get container status \"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4\": rpc error: code = NotFound desc = could not find container \"9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4\": container with ID starting with 9eeffe3909e7cd34eecd71982c165cb3f3b49bad31e0e724ed2ecb66d11e40d4 not found: ID does not exist" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.252642 4666 scope.go:117] "RemoveContainer" containerID="15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db" Dec 03 12:38:14 crc kubenswrapper[4666]: E1203 12:38:14.253215 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db\": container with ID starting with 15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db not found: ID does not exist" containerID="15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.253256 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db"} err="failed to get container status \"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db\": rpc error: code = NotFound desc = could not find container \"15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db\": container with ID starting with 15f288b64e1e1aa0f36f55848bb5ec4f560e39c6763ad203893e83682e2194db not found: ID does not exist" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.253279 4666 scope.go:117] "RemoveContainer" containerID="5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e" Dec 03 12:38:14 crc kubenswrapper[4666]: E1203 12:38:14.253551 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e\": container with ID starting with 5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e not found: ID does not exist" containerID="5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.253577 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e"} err="failed to get container status \"5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e\": rpc error: code = NotFound desc = could not find container \"5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e\": container with ID starting with 5fc52610cd7f17c5bd17f655fd16e009ce6a284ddd8d3a0b49dd6618cdcff32e not found: ID does not exist" Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.375907 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:14 crc kubenswrapper[4666]: I1203 12:38:14.380616 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w6p79"] Dec 03 12:38:15 crc kubenswrapper[4666]: I1203 12:38:15.042139 4666 generic.go:334] "Generic (PLEG): container finished" podID="b37520ad-5a38-43b8-ba00-071435281aa3" containerID="b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19" exitCode=0 Dec 03 12:38:15 
crc kubenswrapper[4666]: I1203 12:38:15.042261 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerDied","Data":"b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19"} Dec 03 12:38:15 crc kubenswrapper[4666]: I1203 12:38:15.042592 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerStarted","Data":"0c92321415f77c76a0826f3cbbdedde33dcbb7615036bb7ed4742f79625a70bd"} Dec 03 12:38:15 crc kubenswrapper[4666]: I1203 12:38:15.431114 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" path="/var/lib/kubelet/pods/f9f05e87-d7ac-4f1f-816c-c0be06b66d7a/volumes" Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.060674 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" event={"ID":"b7fc3a3d-9867-4055-b071-c43574b66e7a","Type":"ContainerStarted","Data":"4fe734cd876b35e3aa0d9823834f7051484c1c392f65375f1628af6b9a25436e"} Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.061529 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.063241 4666 generic.go:334] "Generic (PLEG): container finished" podID="b37520ad-5a38-43b8-ba00-071435281aa3" containerID="34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a" exitCode=0 Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.063325 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerDied","Data":"34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a"} Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.065619 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" event={"ID":"9dafb972-28e7-4392-9d8f-0d6036c5adab","Type":"ContainerStarted","Data":"d6108bba309d88728ab2359eed4198d601e989238e02eab509d9d9db4515e6e0"} Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.074496 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-q94td" event={"ID":"100d1193-8c3e-442e-8e9c-9983b5292555","Type":"ContainerStarted","Data":"7726535ea6151bd1c54ee9fac0bb74a10868fc52551ebd94a75ee4e3fbea2095"} Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.081863 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" podStartSLOduration=2.7597675710000003 podStartE2EDuration="6.081834074s" podCreationTimestamp="2025-12-03 12:38:11 +0000 UTC" firstStartedPulling="2025-12-03 12:38:12.615472259 +0000 UTC m=+1481.460433310" lastFinishedPulling="2025-12-03 12:38:15.937538762 +0000 UTC m=+1484.782499813" observedRunningTime="2025-12-03 12:38:17.079664545 +0000 UTC m=+1485.924625596" watchObservedRunningTime="2025-12-03 12:38:17.081834074 +0000 UTC m=+1485.926795135" Dec 03 12:38:17 crc kubenswrapper[4666]: I1203 12:38:17.100296 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-g7hwp" podStartSLOduration=2.867879909 podStartE2EDuration="6.100274072s" 
podCreationTimestamp="2025-12-03 12:38:11 +0000 UTC" firstStartedPulling="2025-12-03 12:38:12.698303785 +0000 UTC m=+1481.543264846" lastFinishedPulling="2025-12-03 12:38:15.930697948 +0000 UTC m=+1484.775659009" observedRunningTime="2025-12-03 12:38:17.096731856 +0000 UTC m=+1485.941692947" watchObservedRunningTime="2025-12-03 12:38:17.100274072 +0000 UTC m=+1485.945235123" Dec 03 12:38:18 crc kubenswrapper[4666]: I1203 12:38:18.084209 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerStarted","Data":"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf"} Dec 03 12:38:18 crc kubenswrapper[4666]: I1203 12:38:18.108432 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-q94td" podStartSLOduration=3.435379439 podStartE2EDuration="7.108410347s" podCreationTimestamp="2025-12-03 12:38:11 +0000 UTC" firstStartedPulling="2025-12-03 12:38:12.237874665 +0000 UTC m=+1481.082835716" lastFinishedPulling="2025-12-03 12:38:15.910905573 +0000 UTC m=+1484.755866624" observedRunningTime="2025-12-03 12:38:17.146775567 +0000 UTC m=+1485.991736618" watchObservedRunningTime="2025-12-03 12:38:18.108410347 +0000 UTC m=+1486.953371398" Dec 03 12:38:18 crc kubenswrapper[4666]: I1203 12:38:18.109134 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r6ft8" podStartSLOduration=2.885935738 podStartE2EDuration="5.109128216s" podCreationTimestamp="2025-12-03 12:38:13 +0000 UTC" firstStartedPulling="2025-12-03 12:38:15.23179809 +0000 UTC m=+1484.076759151" lastFinishedPulling="2025-12-03 12:38:17.454990578 +0000 UTC m=+1486.299951629" observedRunningTime="2025-12-03 12:38:18.10556182 +0000 UTC m=+1486.950522871" watchObservedRunningTime="2025-12-03 12:38:18.109128216 +0000 UTC m=+1486.954089267" Dec 03 12:38:21 crc kubenswrapper[4666]: I1203 12:38:21.988676 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nf766" Dec 03 12:38:23 crc kubenswrapper[4666]: I1203 12:38:23.511247 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:23 crc kubenswrapper[4666]: I1203 12:38:23.511313 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:23 crc kubenswrapper[4666]: I1203 12:38:23.548383 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:24 crc kubenswrapper[4666]: I1203 12:38:24.162678 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:24 crc kubenswrapper[4666]: I1203 12:38:24.206690 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.133037 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r6ft8" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="registry-server" containerID="cri-o://69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf" gracePeriod=2 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.917715 4666 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mh5x5"] Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918214 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-controller" containerID="cri-o://bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918359 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-acl-logging" containerID="cri-o://05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918317 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="northd" containerID="cri-o://115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918429 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="nbdb" containerID="cri-o://f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918503 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918463 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-node" containerID="cri-o://ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.918979 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="sbdb" containerID="cri-o://847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b" gracePeriod=30 Dec 03 12:38:26 crc kubenswrapper[4666]: I1203 12:38:26.962729 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" containerID="cri-o://0bb44670867ae612d8f85b3373547805b8e13748b98e3083804cee3d938801c7" gracePeriod=30 Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.424775 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.457591 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mwp\" (UniqueName: \"kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp\") pod \"b37520ad-5a38-43b8-ba00-071435281aa3\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.457650 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content\") pod \"b37520ad-5a38-43b8-ba00-071435281aa3\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.457802 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities\") pod \"b37520ad-5a38-43b8-ba00-071435281aa3\" (UID: \"b37520ad-5a38-43b8-ba00-071435281aa3\") " Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.461294 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities" (OuterVolumeSpecName: "utilities") pod "b37520ad-5a38-43b8-ba00-071435281aa3" (UID: "b37520ad-5a38-43b8-ba00-071435281aa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.470385 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp" (OuterVolumeSpecName: "kube-api-access-b4mwp") pod "b37520ad-5a38-43b8-ba00-071435281aa3" (UID: "b37520ad-5a38-43b8-ba00-071435281aa3"). InnerVolumeSpecName "kube-api-access-b4mwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.514725 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b37520ad-5a38-43b8-ba00-071435281aa3" (UID: "b37520ad-5a38-43b8-ba00-071435281aa3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.560115 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.560168 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mwp\" (UniqueName: \"kubernetes.io/projected/b37520ad-5a38-43b8-ba00-071435281aa3-kube-api-access-b4mwp\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:27 crc kubenswrapper[4666]: I1203 12:38:27.560180 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37520ad-5a38-43b8-ba00-071435281aa3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.150171 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/2.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.150812 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/1.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.150869 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba134276-4c96-4ba6-b18f-276b312a7355" containerID="fcfa2f98a9da4e4ba0160ca2261d53489e3922f93e699e769d5c745afc146156" exitCode=2 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.150915 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerDied","Data":"fcfa2f98a9da4e4ba0160ca2261d53489e3922f93e699e769d5c745afc146156"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.151008 4666 scope.go:117] "RemoveContainer" containerID="8624c72cacbdd058470173fc3d7659d5db4c7731c7c1be495b871595058347bd" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.153617 4666 scope.go:117] "RemoveContainer" containerID="fcfa2f98a9da4e4ba0160ca2261d53489e3922f93e699e769d5c745afc146156" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.158873 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovnkube-controller/3.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.163318 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-acl-logging/0.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.164189 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-controller/0.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168596 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="0bb44670867ae612d8f85b3373547805b8e13748b98e3083804cee3d938801c7" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168653 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168668 4666 generic.go:334] "Generic (PLEG): container finished" 
podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168685 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168698 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168710 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168722 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a" exitCode=143 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168735 4666 generic.go:334] "Generic (PLEG): container finished" podID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerID="bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60" exitCode=143 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168718 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"0bb44670867ae612d8f85b3373547805b8e13748b98e3083804cee3d938801c7"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168813 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168901 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168919 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168947 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168968 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.168988 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" 
event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.169008 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.177895 4666 generic.go:334] "Generic (PLEG): container finished" podID="b37520ad-5a38-43b8-ba00-071435281aa3" containerID="69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf" exitCode=0 Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.177953 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerDied","Data":"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.177990 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r6ft8" event={"ID":"b37520ad-5a38-43b8-ba00-071435281aa3","Type":"ContainerDied","Data":"0c92321415f77c76a0826f3cbbdedde33dcbb7615036bb7ed4742f79625a70bd"} Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.178063 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r6ft8" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.212163 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.224162 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r6ft8"] Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.521394 4666 scope.go:117] "RemoveContainer" containerID="801ac8944e252a3242e0fd7d3939089618af32fbb2b95be69337d5acdfe49b26" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.526793 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-acl-logging/0.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.528588 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-controller/0.log" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.529255 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.544796 4666 scope.go:117] "RemoveContainer" containerID="69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.571397 4666 scope.go:117] "RemoveContainer" containerID="34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574338 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574379 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574403 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574420 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574437 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574459 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574474 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574491 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574505 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd\") pod 
\"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574522 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574568 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574602 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574668 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574716 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whx68\" (UniqueName: \"kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574741 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574852 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574872 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574890 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574911 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.574928 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib\") pod \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\" (UID: \"6fce11cd-ec4a-4e25-9483-21a8a45f332c\") " Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.575653 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576040 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576096 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576123 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576142 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log" (OuterVolumeSpecName: "node-log") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576160 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576453 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576478 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.576498 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket" (OuterVolumeSpecName: "log-socket") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.577451 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.577613 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash" (OuterVolumeSpecName: "host-slash") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.577649 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.577700 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.578154 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.578257 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.578250 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.578285 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.587253 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.590297 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68" (OuterVolumeSpecName: "kube-api-access-whx68") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "kube-api-access-whx68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.594312 4666 scope.go:117] "RemoveContainer" containerID="b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.595657 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6fce11cd-ec4a-4e25-9483-21a8a45f332c" (UID: "6fce11cd-ec4a-4e25-9483-21a8a45f332c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.603588 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mwvvc"] Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604448 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="sbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604478 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="sbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604496 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604514 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604526 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-acl-logging" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604534 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-acl-logging" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604546 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="extract-utilities" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604554 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="extract-utilities" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604566 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604574 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604590 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="extract-content" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604598 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="extract-content" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604607 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604615 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604623 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604630 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604641 4666 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604648 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604658 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="northd" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604667 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="northd" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604676 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604685 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604693 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604702 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604712 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="nbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604721 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="nbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604736 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604744 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604752 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="extract-content" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604759 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="extract-content" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604770 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kubecfg-setup" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604777 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kubecfg-setup" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604787 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604821 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604831 4666 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="extract-utilities" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604841 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="extract-utilities" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.604850 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-node" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604858 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-node" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604984 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.604995 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="sbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605013 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605025 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605037 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605050 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f05e87-d7ac-4f1f-816c-c0be06b66d7a" containerName="registry-server" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605060 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605071 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-acl-logging" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605080 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="nbdb" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605346 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605357 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovn-controller" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605373 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="kube-rbac-proxy-node" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605384 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="northd" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.605619 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" containerName="ovnkube-controller" Dec 03 12:38:28 crc 
kubenswrapper[4666]: I1203 12:38:28.617385 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.627664 4666 scope.go:117] "RemoveContainer" containerID="69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.628481 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf\": container with ID starting with 69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf not found: ID does not exist" containerID="69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.628535 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf"} err="failed to get container status \"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf\": rpc error: code = NotFound desc = could not find container \"69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf\": container with ID starting with 69ccd28c09f9ef0f65785b3be0d0590d18ece73104bbe1b780301ebbd23575cf not found: ID does not exist" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.628567 4666 scope.go:117] "RemoveContainer" containerID="34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.629438 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a\": container with ID starting with 34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a not found: ID does not exist" containerID="34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.629469 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a"} err="failed to get container status \"34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a\": rpc error: code = NotFound desc = could not find container \"34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a\": container with ID starting with 34b8dfbb70db44da2f3891c2f4d11c23b56fcc8bcc76b93dd42e8b7ca1cedf8a not found: ID does not exist" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.629486 4666 scope.go:117] "RemoveContainer" containerID="b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19" Dec 03 12:38:28 crc kubenswrapper[4666]: E1203 12:38:28.629855 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19\": container with ID starting with b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19 not found: ID does not exist" containerID="b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.629921 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19"} err="failed to get container status 
\"b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19\": rpc error: code = NotFound desc = could not find container \"b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19\": container with ID starting with b3b34a2b78e71c42e883b25376d39b7e5c27ac4953c1f9e28f132dedfa4a5a19 not found: ID does not exist" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.678361 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-script-lib\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.678415 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-kubelet\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.678458 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.678647 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.678685 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovn-node-metrics-cert\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679052 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-var-lib-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679108 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-bin\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679139 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-env-overrides\") pod \"ovnkube-node-mwvvc\" (UID: 
\"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679161 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-etc-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679183 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-config\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679211 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrtf\" (UniqueName: \"kubernetes.io/projected/d56ece31-a98c-4903-8cb8-37f2aec6abcd-kube-api-access-lwrtf\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679236 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-ovn\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679257 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-slash\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679282 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-netns\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679313 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679334 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-log-socket\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679351 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-systemd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679373 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-netd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679394 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-systemd-units\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679413 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-node-log\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679461 4666 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679473 4666 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679484 4666 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679497 4666 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679506 4666 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679515 4666 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679525 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whx68\" (UniqueName: \"kubernetes.io/projected/6fce11cd-ec4a-4e25-9483-21a8a45f332c-kube-api-access-whx68\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679536 4666 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-slash\") on node \"crc\" DevicePath \"\"" Dec 
03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679548 4666 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679561 4666 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679575 4666 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679584 4666 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679594 4666 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679603 4666 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679611 4666 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679620 4666 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679630 4666 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679643 4666 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679655 4666 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fce11cd-ec4a-4e25-9483-21a8a45f332c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.679666 4666 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6fce11cd-ec4a-4e25-9483-21a8a45f332c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780598 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-env-overrides\") pod \"ovnkube-node-mwvvc\" (UID: 
\"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780648 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-etc-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780666 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-config\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780682 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrtf\" (UniqueName: \"kubernetes.io/projected/d56ece31-a98c-4903-8cb8-37f2aec6abcd-kube-api-access-lwrtf\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780706 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-ovn\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780727 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-slash\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780747 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-netns\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780772 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-log-socket\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780789 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780808 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-systemd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 
12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780826 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-netd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780846 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-systemd-units\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780864 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-node-log\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780919 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-script-lib\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780937 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-kubelet\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780962 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.780982 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781005 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovn-node-metrics-cert\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781022 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-var-lib-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: 
I1203 12:38:28.781038 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-bin\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781126 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-bin\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781167 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-cni-netd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781192 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-systemd-units\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781201 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-systemd\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781220 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-node-log\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781254 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-slash\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781294 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781327 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-ovn\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781429 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-run-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781435 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-etc-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781469 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-var-lib-openvswitch\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781531 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-netns\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781634 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781701 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-host-kubelet\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.781708 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56ece31-a98c-4903-8cb8-37f2aec6abcd-log-socket\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.782026 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-env-overrides\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.782210 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-config\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.782355 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovnkube-script-lib\") pod \"ovnkube-node-mwvvc\" (UID: 
\"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.786676 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56ece31-a98c-4903-8cb8-37f2aec6abcd-ovn-node-metrics-cert\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.801149 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrtf\" (UniqueName: \"kubernetes.io/projected/d56ece31-a98c-4903-8cb8-37f2aec6abcd-kube-api-access-lwrtf\") pod \"ovnkube-node-mwvvc\" (UID: \"d56ece31-a98c-4903-8cb8-37f2aec6abcd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: I1203 12:38:28.932593 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:28 crc kubenswrapper[4666]: W1203 12:38:28.965714 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56ece31_a98c_4903_8cb8_37f2aec6abcd.slice/crio-da71fc9c102908d97539b72a2acb5e996ead04d4e5bcb8bb8cee0600764b3c29 WatchSource:0}: Error finding container da71fc9c102908d97539b72a2acb5e996ead04d4e5bcb8bb8cee0600764b3c29: Status 404 returned error can't find the container with id da71fc9c102908d97539b72a2acb5e996ead04d4e5bcb8bb8cee0600764b3c29 Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.189478 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-acl-logging/0.log" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.190035 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mh5x5_6fce11cd-ec4a-4e25-9483-21a8a45f332c/ovn-controller/0.log" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.190804 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" event={"ID":"6fce11cd-ec4a-4e25-9483-21a8a45f332c","Type":"ContainerDied","Data":"1fb6d1da6897c031ffd99693a1ec02f1feb37712a63fa0cdf6e4e412874621fd"} Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.190881 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mh5x5" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.190896 4666 scope.go:117] "RemoveContainer" containerID="0bb44670867ae612d8f85b3373547805b8e13748b98e3083804cee3d938801c7" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.194458 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wbdks_ba134276-4c96-4ba6-b18f-276b312a7355/kube-multus/2.log" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.194553 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wbdks" event={"ID":"ba134276-4c96-4ba6-b18f-276b312a7355","Type":"ContainerStarted","Data":"dbf2630d20cca745aeda6bbd0188c1680cea3eb02a244188c97db6105c582971"} Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.196707 4666 generic.go:334] "Generic (PLEG): container finished" podID="d56ece31-a98c-4903-8cb8-37f2aec6abcd" containerID="b4ebc2b4cd946b9a39e13a694c84672351059b02f6d59a912768be2791d6161e" exitCode=0 Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.196783 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerDied","Data":"b4ebc2b4cd946b9a39e13a694c84672351059b02f6d59a912768be2791d6161e"} Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.196828 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"da71fc9c102908d97539b72a2acb5e996ead04d4e5bcb8bb8cee0600764b3c29"} Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.214697 4666 scope.go:117] "RemoveContainer" containerID="847d431c8cd31185c119dc306d294ddb4bf2713e8affd32b8525c05f350d517b" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.272310 4666 scope.go:117] "RemoveContainer" containerID="f071d7b426ed7739568b9c7911e080bbc0ab80743da0a79eb125ab3af3b2f430" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.327456 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mh5x5"] Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.330854 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mh5x5"] Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.345844 4666 scope.go:117] "RemoveContainer" containerID="115f01bf70a512c3b70046a4342a65af4fd980c3563bba4e8b5f4affb8158a11" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.362222 4666 scope.go:117] "RemoveContainer" containerID="7d0cd31b258b31c8939d7d4f1fa98c5f08e4fe7fab1ed33d99afbf67832c0a18" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.377532 4666 scope.go:117] "RemoveContainer" containerID="ad4b2ee5cd4c25d585ed8e8868cca053f3cf68081b5631cd6f63535e97e6deb3" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.393264 4666 scope.go:117] "RemoveContainer" containerID="05ac7a03a85c270d7a6fe627ed08c3745cd0d5de8463cfb643b80c74e999551a" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.420019 4666 scope.go:117] "RemoveContainer" containerID="bd49f422c38cf0dbae75ea5164f25f80a9bab44bcefee94d23f1172ec79b6d60" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.431042 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fce11cd-ec4a-4e25-9483-21a8a45f332c" path="/var/lib/kubelet/pods/6fce11cd-ec4a-4e25-9483-21a8a45f332c/volumes" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 
12:38:29.432851 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37520ad-5a38-43b8-ba00-071435281aa3" path="/var/lib/kubelet/pods/b37520ad-5a38-43b8-ba00-071435281aa3/volumes" Dec 03 12:38:29 crc kubenswrapper[4666]: I1203 12:38:29.438357 4666 scope.go:117] "RemoveContainer" containerID="3fc5e960bd0df37fab8f9c15ba225ee09aa78b63dfd838cbce91c03f7ee063c3" Dec 03 12:38:30 crc kubenswrapper[4666]: I1203 12:38:30.208177 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"e589056ff0afba6892ce7113e14f6c0cce499e172de21dabe2e1b073732103fb"} Dec 03 12:38:30 crc kubenswrapper[4666]: I1203 12:38:30.208240 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"8d84e53190b59129e7023376623fd0ffd91718444c94742c5a8ed961d776370f"} Dec 03 12:38:30 crc kubenswrapper[4666]: I1203 12:38:30.208258 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"c8462699c14672b2a869ae9b25c60ecb31866baaadb44a38805629ae6bdb7e40"} Dec 03 12:38:30 crc kubenswrapper[4666]: I1203 12:38:30.208273 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"20a68e4a5cb73bd6b1f791a7a59f6210aa202dc404dff129a7c8ec28e4265674"} Dec 03 12:38:30 crc kubenswrapper[4666]: I1203 12:38:30.208288 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"fa2d7af4215d7661929a2ca34b3906cc733522150179fbd12c106107c52829c0"} Dec 03 12:38:31 crc kubenswrapper[4666]: I1203 12:38:31.220634 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"f991a71c453532b5938d13455dd57bbe9850fdadb81500f97a0b5c7d5cf4f94c"} Dec 03 12:38:33 crc kubenswrapper[4666]: I1203 12:38:33.235274 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"b1bcf2bcc8cfb3bc7242362d04557ef267a22fde074339accb7c096a6ef9c633"} Dec 03 12:38:36 crc kubenswrapper[4666]: I1203 12:38:36.261962 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" event={"ID":"d56ece31-a98c-4903-8cb8-37f2aec6abcd","Type":"ContainerStarted","Data":"ad2175ed2ce899bc19e1286fcdaa6bf7ebccc9a884b17c904b4b0ac98b0487ff"} Dec 03 12:38:36 crc kubenswrapper[4666]: I1203 12:38:36.262741 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:36 crc kubenswrapper[4666]: I1203 12:38:36.292520 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:36 crc kubenswrapper[4666]: I1203 12:38:36.300800 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" podStartSLOduration=8.300773241 
podStartE2EDuration="8.300773241s" podCreationTimestamp="2025-12-03 12:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:38:36.299292891 +0000 UTC m=+1505.144253952" watchObservedRunningTime="2025-12-03 12:38:36.300773241 +0000 UTC m=+1505.145734292" Dec 03 12:38:37 crc kubenswrapper[4666]: I1203 12:38:37.268033 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:37 crc kubenswrapper[4666]: I1203 12:38:37.268117 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:37 crc kubenswrapper[4666]: I1203 12:38:37.295223 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:38:58 crc kubenswrapper[4666]: I1203 12:38:58.966174 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwvvc" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.671576 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l"] Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.673491 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.675738 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.685224 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l"] Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.805384 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9gdq\" (UniqueName: \"kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.805489 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.805544 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.906750 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.906968 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9gdq\" (UniqueName: \"kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.907149 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.907545 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.907676 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.931298 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9gdq\" (UniqueName: \"kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:03 crc kubenswrapper[4666]: I1203 12:39:03.991462 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:04 crc kubenswrapper[4666]: I1203 12:39:04.253654 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l"] Dec 03 12:39:04 crc kubenswrapper[4666]: I1203 12:39:04.489374 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerStarted","Data":"022b6d7acef3040a9bf7f3813cefc6f96d0628c4cf25636a852f928635097985"} Dec 03 12:39:04 crc kubenswrapper[4666]: I1203 12:39:04.489444 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerStarted","Data":"74e0370293125e1062b52c737fa717e24bbbbdcf4d536e84df7496107e2d12c9"} Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.497623 4666 generic.go:334] "Generic (PLEG): container finished" podID="49e29685-fddc-4db1-acbf-07d806a3280e" containerID="022b6d7acef3040a9bf7f3813cefc6f96d0628c4cf25636a852f928635097985" exitCode=0 Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.497743 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerDied","Data":"022b6d7acef3040a9bf7f3813cefc6f96d0628c4cf25636a852f928635097985"} Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.859342 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.860768 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.876435 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.940099 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.940164 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:05 crc kubenswrapper[4666]: I1203 12:39:05.940193 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r2md\" (UniqueName: \"kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.041952 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.042013 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.042039 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r2md\" (UniqueName: \"kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.042672 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.042908 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.077895 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4r2md\" (UniqueName: \"kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md\") pod \"redhat-operators-sh9p4\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.218046 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.458410 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:06 crc kubenswrapper[4666]: W1203 12:39:06.470184 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba87c49d_1fc8_482e_89d7_61493fb04a13.slice/crio-ac892d429aa0d91e47dfdd0a428b25b850a859893f3adc44590cc5f8c3633db5 WatchSource:0}: Error finding container ac892d429aa0d91e47dfdd0a428b25b850a859893f3adc44590cc5f8c3633db5: Status 404 returned error can't find the container with id ac892d429aa0d91e47dfdd0a428b25b850a859893f3adc44590cc5f8c3633db5 Dec 03 12:39:06 crc kubenswrapper[4666]: I1203 12:39:06.506038 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerStarted","Data":"ac892d429aa0d91e47dfdd0a428b25b850a859893f3adc44590cc5f8c3633db5"} Dec 03 12:39:07 crc kubenswrapper[4666]: I1203 12:39:07.515350 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerID="48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db" exitCode=0 Dec 03 12:39:07 crc kubenswrapper[4666]: I1203 12:39:07.515405 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerDied","Data":"48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db"} Dec 03 12:39:07 crc kubenswrapper[4666]: I1203 12:39:07.519739 4666 generic.go:334] "Generic (PLEG): container finished" podID="49e29685-fddc-4db1-acbf-07d806a3280e" containerID="3a4d22406459f4c75621e7490182ea5a0d0cffa8130f7bde0815b83fa75e130e" exitCode=0 Dec 03 12:39:07 crc kubenswrapper[4666]: I1203 12:39:07.519798 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerDied","Data":"3a4d22406459f4c75621e7490182ea5a0d0cffa8130f7bde0815b83fa75e130e"} Dec 03 12:39:08 crc kubenswrapper[4666]: I1203 12:39:08.529725 4666 generic.go:334] "Generic (PLEG): container finished" podID="49e29685-fddc-4db1-acbf-07d806a3280e" containerID="c4fb907ada78214d2de5dd613deae8ce1aba945ca0a07f7c00d2b23901ea47e2" exitCode=0 Dec 03 12:39:08 crc kubenswrapper[4666]: I1203 12:39:08.529817 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerDied","Data":"c4fb907ada78214d2de5dd613deae8ce1aba945ca0a07f7c00d2b23901ea47e2"} Dec 03 12:39:08 crc kubenswrapper[4666]: I1203 12:39:08.534006 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" 
event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerStarted","Data":"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6"} Dec 03 12:39:09 crc kubenswrapper[4666]: I1203 12:39:09.546692 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerID="534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6" exitCode=0 Dec 03 12:39:09 crc kubenswrapper[4666]: I1203 12:39:09.546769 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerDied","Data":"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6"} Dec 03 12:39:09 crc kubenswrapper[4666]: I1203 12:39:09.846637 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.007697 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util\") pod \"49e29685-fddc-4db1-acbf-07d806a3280e\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.007772 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle\") pod \"49e29685-fddc-4db1-acbf-07d806a3280e\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.007815 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9gdq\" (UniqueName: \"kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq\") pod \"49e29685-fddc-4db1-acbf-07d806a3280e\" (UID: \"49e29685-fddc-4db1-acbf-07d806a3280e\") " Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.008816 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle" (OuterVolumeSpecName: "bundle") pod "49e29685-fddc-4db1-acbf-07d806a3280e" (UID: "49e29685-fddc-4db1-acbf-07d806a3280e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.016360 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq" (OuterVolumeSpecName: "kube-api-access-l9gdq") pod "49e29685-fddc-4db1-acbf-07d806a3280e" (UID: "49e29685-fddc-4db1-acbf-07d806a3280e"). InnerVolumeSpecName "kube-api-access-l9gdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.024796 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util" (OuterVolumeSpecName: "util") pod "49e29685-fddc-4db1-acbf-07d806a3280e" (UID: "49e29685-fddc-4db1-acbf-07d806a3280e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.109407 4666 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.109467 4666 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49e29685-fddc-4db1-acbf-07d806a3280e-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.109490 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9gdq\" (UniqueName: \"kubernetes.io/projected/49e29685-fddc-4db1-acbf-07d806a3280e-kube-api-access-l9gdq\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.558611 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerStarted","Data":"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686"} Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.562542 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" event={"ID":"49e29685-fddc-4db1-acbf-07d806a3280e","Type":"ContainerDied","Data":"74e0370293125e1062b52c737fa717e24bbbbdcf4d536e84df7496107e2d12c9"} Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.562575 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.562604 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e0370293125e1062b52c737fa717e24bbbbdcf4d536e84df7496107e2d12c9" Dec 03 12:39:10 crc kubenswrapper[4666]: I1203 12:39:10.579944 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sh9p4" podStartSLOduration=2.8005784240000002 podStartE2EDuration="5.579910225s" podCreationTimestamp="2025-12-03 12:39:05 +0000 UTC" firstStartedPulling="2025-12-03 12:39:07.517945624 +0000 UTC m=+1536.362906675" lastFinishedPulling="2025-12-03 12:39:10.297277415 +0000 UTC m=+1539.142238476" observedRunningTime="2025-12-03 12:39:10.579029011 +0000 UTC m=+1539.423990102" watchObservedRunningTime="2025-12-03 12:39:10.579910225 +0000 UTC m=+1539.424871296" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.054422 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn"] Dec 03 12:39:15 crc kubenswrapper[4666]: E1203 12:39:15.055052 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="pull" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.055066 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="pull" Dec 03 12:39:15 crc kubenswrapper[4666]: E1203 12:39:15.055109 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="util" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.055116 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="util" Dec 03 12:39:15 
crc kubenswrapper[4666]: E1203 12:39:15.055124 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="extract" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.055131 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="extract" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.055758 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e29685-fddc-4db1-acbf-07d806a3280e" containerName="extract" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.056234 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.058305 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.058486 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-f2ddp" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.060052 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.079560 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn"] Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.187678 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcds\" (UniqueName: \"kubernetes.io/projected/c28a05b1-6eb0-43a1-a581-c8a5f3b956b6-kube-api-access-6lcds\") pod \"nmstate-operator-5b5b58f5c8-wqnxn\" (UID: \"c28a05b1-6eb0-43a1-a581-c8a5f3b956b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.289271 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcds\" (UniqueName: \"kubernetes.io/projected/c28a05b1-6eb0-43a1-a581-c8a5f3b956b6-kube-api-access-6lcds\") pod \"nmstate-operator-5b5b58f5c8-wqnxn\" (UID: \"c28a05b1-6eb0-43a1-a581-c8a5f3b956b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.319124 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcds\" (UniqueName: \"kubernetes.io/projected/c28a05b1-6eb0-43a1-a581-c8a5f3b956b6-kube-api-access-6lcds\") pod \"nmstate-operator-5b5b58f5c8-wqnxn\" (UID: \"c28a05b1-6eb0-43a1-a581-c8a5f3b956b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.373784 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" Dec 03 12:39:15 crc kubenswrapper[4666]: I1203 12:39:15.888711 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn"] Dec 03 12:39:16 crc kubenswrapper[4666]: I1203 12:39:16.219170 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:16 crc kubenswrapper[4666]: I1203 12:39:16.219714 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:16 crc kubenswrapper[4666]: I1203 12:39:16.610170 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" event={"ID":"c28a05b1-6eb0-43a1-a581-c8a5f3b956b6","Type":"ContainerStarted","Data":"5ce1de1249d4a8867e970262f77ac55a6c48164a9532091a705edd643cb4e5f4"} Dec 03 12:39:17 crc kubenswrapper[4666]: I1203 12:39:17.260871 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sh9p4" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:17 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:17 crc kubenswrapper[4666]: > Dec 03 12:39:19 crc kubenswrapper[4666]: I1203 12:39:19.631325 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" event={"ID":"c28a05b1-6eb0-43a1-a581-c8a5f3b956b6","Type":"ContainerStarted","Data":"cd2547f586e256c2df3903ccd1848b406991edd1f3d95fea331dec9a8f869055"} Dec 03 12:39:19 crc kubenswrapper[4666]: I1203 12:39:19.660741 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wqnxn" podStartSLOduration=1.347118857 podStartE2EDuration="4.660704091s" podCreationTimestamp="2025-12-03 12:39:15 +0000 UTC" firstStartedPulling="2025-12-03 12:39:15.901146827 +0000 UTC m=+1544.746107878" lastFinishedPulling="2025-12-03 12:39:19.214732061 +0000 UTC m=+1548.059693112" observedRunningTime="2025-12-03 12:39:19.652294614 +0000 UTC m=+1548.497255715" watchObservedRunningTime="2025-12-03 12:39:19.660704091 +0000 UTC m=+1548.505665172" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.150451 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.152501 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.157770 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xtqgx" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.165322 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.191013 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.191925 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.193929 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.213378 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7x7ll"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.214267 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.226496 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.300804 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9t7\" (UniqueName: \"kubernetes.io/projected/65d7f250-10bf-4a17-879f-856d2ea16b91-kube-api-access-qn9t7\") pod \"nmstate-metrics-7f946cbc9-5gp27\" (UID: \"65d7f250-10bf-4a17-879f-856d2ea16b91\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402365 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9t7\" (UniqueName: \"kubernetes.io/projected/65d7f250-10bf-4a17-879f-856d2ea16b91-kube-api-access-qn9t7\") pod \"nmstate-metrics-7f946cbc9-5gp27\" (UID: \"65d7f250-10bf-4a17-879f-856d2ea16b91\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402500 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl55d\" (UniqueName: \"kubernetes.io/projected/f43cdd10-999d-470a-89d0-909660ec7e67-kube-api-access-pl55d\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402568 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkg4w\" (UniqueName: \"kubernetes.io/projected/02c230c0-43ab-4476-b3dd-64cb686195c0-kube-api-access-bkg4w\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402614 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-dbus-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402634 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f43cdd10-999d-470a-89d0-909660ec7e67-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402655 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-ovs-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.402702 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-nmstate-lock\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.419003 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.419970 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.424360 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-p9nnn" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.424370 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.424468 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.430980 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9t7\" (UniqueName: \"kubernetes.io/projected/65d7f250-10bf-4a17-879f-856d2ea16b91-kube-api-access-qn9t7\") pod \"nmstate-metrics-7f946cbc9-5gp27\" (UID: \"65d7f250-10bf-4a17-879f-856d2ea16b91\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.443069 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.475631 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516078 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl55d\" (UniqueName: \"kubernetes.io/projected/f43cdd10-999d-470a-89d0-909660ec7e67-kube-api-access-pl55d\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516669 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkg4w\" (UniqueName: \"kubernetes.io/projected/02c230c0-43ab-4476-b3dd-64cb686195c0-kube-api-access-bkg4w\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516707 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-dbus-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516750 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f43cdd10-999d-470a-89d0-909660ec7e67-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516781 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-ovs-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516811 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-nmstate-lock\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.516920 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-nmstate-lock\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.526008 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-dbus-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.526889 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02c230c0-43ab-4476-b3dd-64cb686195c0-ovs-socket\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc 
kubenswrapper[4666]: I1203 12:39:24.536341 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f43cdd10-999d-470a-89d0-909660ec7e67-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.553749 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkg4w\" (UniqueName: \"kubernetes.io/projected/02c230c0-43ab-4476-b3dd-64cb686195c0-kube-api-access-bkg4w\") pod \"nmstate-handler-7x7ll\" (UID: \"02c230c0-43ab-4476-b3dd-64cb686195c0\") " pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.565803 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl55d\" (UniqueName: \"kubernetes.io/projected/f43cdd10-999d-470a-89d0-909660ec7e67-kube-api-access-pl55d\") pod \"nmstate-webhook-5f6d4c5ccb-vvj59\" (UID: \"f43cdd10-999d-470a-89d0-909660ec7e67\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.607610 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-96d9dfbb5-rpg8d"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.608445 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.617930 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8f7dae47-5e1c-4945-9827-33a00c4c0d66-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.617971 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7dae47-5e1c-4945-9827-33a00c4c0d66-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.617996 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4nn\" (UniqueName: \"kubernetes.io/projected/8f7dae47-5e1c-4945-9827-33a00c4c0d66-kube-api-access-pn4nn\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.636998 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-96d9dfbb5-rpg8d"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720456 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-service-ca\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720553 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-oauth-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720638 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-oauth-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720688 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-trusted-ca-bundle\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720723 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-console-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720778 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8f7dae47-5e1c-4945-9827-33a00c4c0d66-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720800 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7dae47-5e1c-4945-9827-33a00c4c0d66-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720839 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.720869 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4nn\" (UniqueName: \"kubernetes.io/projected/8f7dae47-5e1c-4945-9827-33a00c4c0d66-kube-api-access-pn4nn\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.721043 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9hn\" (UniqueName: \"kubernetes.io/projected/fb3fda58-5638-4c98-b9e5-597bd7213048-kube-api-access-sp9hn\") pod 
\"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.722313 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8f7dae47-5e1c-4945-9827-33a00c4c0d66-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.730350 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7dae47-5e1c-4945-9827-33a00c4c0d66-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.749553 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4nn\" (UniqueName: \"kubernetes.io/projected/8f7dae47-5e1c-4945-9827-33a00c4c0d66-kube-api-access-pn4nn\") pod \"nmstate-console-plugin-7fbb5f6569-m4v87\" (UID: \"8f7dae47-5e1c-4945-9827-33a00c4c0d66\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.769592 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.771124 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27"] Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.809179 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822108 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-oauth-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822149 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-trusted-ca-bundle\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822198 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-console-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822219 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822254 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9hn\" (UniqueName: \"kubernetes.io/projected/fb3fda58-5638-4c98-b9e5-597bd7213048-kube-api-access-sp9hn\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822281 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-service-ca\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.822315 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-oauth-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.823624 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-oauth-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.823824 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-service-ca\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " 
pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.824178 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-trusted-ca-bundle\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.824290 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb3fda58-5638-4c98-b9e5-597bd7213048-console-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.826458 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-oauth-config\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.826652 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3fda58-5638-4c98-b9e5-597bd7213048-console-serving-cert\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.832692 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.841071 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9hn\" (UniqueName: \"kubernetes.io/projected/fb3fda58-5638-4c98-b9e5-597bd7213048-kube-api-access-sp9hn\") pod \"console-96d9dfbb5-rpg8d\" (UID: \"fb3fda58-5638-4c98-b9e5-597bd7213048\") " pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:24 crc kubenswrapper[4666]: W1203 12:39:24.859497 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c230c0_43ab_4476_b3dd_64cb686195c0.slice/crio-7f6083eeb34ab526e9870c9aecd076d6e165e9ed7443b8138f5d621d4d4742ac WatchSource:0}: Error finding container 7f6083eeb34ab526e9870c9aecd076d6e165e9ed7443b8138f5d621d4d4742ac: Status 404 returned error can't find the container with id 7f6083eeb34ab526e9870c9aecd076d6e165e9ed7443b8138f5d621d4d4742ac Dec 03 12:39:24 crc kubenswrapper[4666]: I1203 12:39:24.931479 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.018778 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87"] Dec 03 12:39:25 crc kubenswrapper[4666]: W1203 12:39:25.026284 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f7dae47_5e1c_4945_9827_33a00c4c0d66.slice/crio-ab42c22781a8de73e5977f9ad1ebcbde5871741818a83ecf9d2846aa7b18ae78 WatchSource:0}: Error finding container ab42c22781a8de73e5977f9ad1ebcbde5871741818a83ecf9d2846aa7b18ae78: Status 404 returned error can't find the container with id ab42c22781a8de73e5977f9ad1ebcbde5871741818a83ecf9d2846aa7b18ae78 Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.051356 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59"] Dec 03 12:39:25 crc kubenswrapper[4666]: W1203 12:39:25.084293 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43cdd10_999d_470a_89d0_909660ec7e67.slice/crio-d0f3a1eed1c0e98f06f48509c97d03a24552c60e323d7a66e976d4fc4f927502 WatchSource:0}: Error finding container d0f3a1eed1c0e98f06f48509c97d03a24552c60e323d7a66e976d4fc4f927502: Status 404 returned error can't find the container with id d0f3a1eed1c0e98f06f48509c97d03a24552c60e323d7a66e976d4fc4f927502 Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.187486 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-96d9dfbb5-rpg8d"] Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.683782 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" event={"ID":"f43cdd10-999d-470a-89d0-909660ec7e67","Type":"ContainerStarted","Data":"d0f3a1eed1c0e98f06f48509c97d03a24552c60e323d7a66e976d4fc4f927502"} Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.685452 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7x7ll" event={"ID":"02c230c0-43ab-4476-b3dd-64cb686195c0","Type":"ContainerStarted","Data":"7f6083eeb34ab526e9870c9aecd076d6e165e9ed7443b8138f5d621d4d4742ac"} Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.687053 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d9dfbb5-rpg8d" event={"ID":"fb3fda58-5638-4c98-b9e5-597bd7213048","Type":"ContainerStarted","Data":"2f98081a74dd9096c0d00c776adac8f080667e71e2167f53d3d753a244075d68"} Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.688784 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" event={"ID":"8f7dae47-5e1c-4945-9827-33a00c4c0d66","Type":"ContainerStarted","Data":"ab42c22781a8de73e5977f9ad1ebcbde5871741818a83ecf9d2846aa7b18ae78"} Dec 03 12:39:25 crc kubenswrapper[4666]: I1203 12:39:25.690809 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" event={"ID":"65d7f250-10bf-4a17-879f-856d2ea16b91","Type":"ContainerStarted","Data":"831171c494e186d9e6e6f520b76b0c9bc242acee5b7b134fa3e2ae3153485899"} Dec 03 12:39:26 crc kubenswrapper[4666]: I1203 12:39:26.266849 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:26 crc kubenswrapper[4666]: I1203 12:39:26.316644 4666 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:26 crc kubenswrapper[4666]: I1203 12:39:26.500810 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:26 crc kubenswrapper[4666]: I1203 12:39:26.704013 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-96d9dfbb5-rpg8d" event={"ID":"fb3fda58-5638-4c98-b9e5-597bd7213048","Type":"ContainerStarted","Data":"60003b02cadcfe33c32c17e9def14c9debe9055d88ddd85f4192bb99d527b82b"} Dec 03 12:39:26 crc kubenswrapper[4666]: I1203 12:39:26.737828 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-96d9dfbb5-rpg8d" podStartSLOduration=2.737760763 podStartE2EDuration="2.737760763s" podCreationTimestamp="2025-12-03 12:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:39:26.731137894 +0000 UTC m=+1555.576098945" watchObservedRunningTime="2025-12-03 12:39:26.737760763 +0000 UTC m=+1555.582721824" Dec 03 12:39:27 crc kubenswrapper[4666]: I1203 12:39:27.710730 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sh9p4" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="registry-server" containerID="cri-o://7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686" gracePeriod=2 Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.490897 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.586861 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r2md\" (UniqueName: \"kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md\") pod \"ba87c49d-1fc8-482e-89d7-61493fb04a13\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.587336 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities\") pod \"ba87c49d-1fc8-482e-89d7-61493fb04a13\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.587461 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content\") pod \"ba87c49d-1fc8-482e-89d7-61493fb04a13\" (UID: \"ba87c49d-1fc8-482e-89d7-61493fb04a13\") " Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.595310 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities" (OuterVolumeSpecName: "utilities") pod "ba87c49d-1fc8-482e-89d7-61493fb04a13" (UID: "ba87c49d-1fc8-482e-89d7-61493fb04a13"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.596364 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md" (OuterVolumeSpecName: "kube-api-access-4r2md") pod "ba87c49d-1fc8-482e-89d7-61493fb04a13" (UID: "ba87c49d-1fc8-482e-89d7-61493fb04a13"). InnerVolumeSpecName "kube-api-access-4r2md". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.688996 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r2md\" (UniqueName: \"kubernetes.io/projected/ba87c49d-1fc8-482e-89d7-61493fb04a13-kube-api-access-4r2md\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.689045 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.717629 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" event={"ID":"f43cdd10-999d-470a-89d0-909660ec7e67","Type":"ContainerStarted","Data":"dd9b70542f897879ad8b9dece91c6ddb12115de9ae12d3d8140e7414eb5ff5e7"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.718640 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.719747 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" event={"ID":"65d7f250-10bf-4a17-879f-856d2ea16b91","Type":"ContainerStarted","Data":"54d5de17543486b59afefea2a1aba7b80be382006f99348cfcce04b9dd21d144"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.723817 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7x7ll" event={"ID":"02c230c0-43ab-4476-b3dd-64cb686195c0","Type":"ContainerStarted","Data":"97dc0aaacb55c5c795e92dc8ce0161847a2986af09baeec864d84ba1c521d9ce"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.723996 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.726301 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerID="7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686" exitCode=0 Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.726379 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh9p4" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.726380 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerDied","Data":"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.726500 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh9p4" event={"ID":"ba87c49d-1fc8-482e-89d7-61493fb04a13","Type":"ContainerDied","Data":"ac892d429aa0d91e47dfdd0a428b25b850a859893f3adc44590cc5f8c3633db5"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.726533 4666 scope.go:117] "RemoveContainer" containerID="7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.727591 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" event={"ID":"8f7dae47-5e1c-4945-9827-33a00c4c0d66","Type":"ContainerStarted","Data":"090ed8b14452cde79be1363b4bd1f177e5f715000d350024381b6b0d56b56d62"} Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.729861 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba87c49d-1fc8-482e-89d7-61493fb04a13" (UID: "ba87c49d-1fc8-482e-89d7-61493fb04a13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.741101 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" podStartSLOduration=1.542913758 podStartE2EDuration="4.741057325s" podCreationTimestamp="2025-12-03 12:39:24 +0000 UTC" firstStartedPulling="2025-12-03 12:39:25.088014387 +0000 UTC m=+1553.932975438" lastFinishedPulling="2025-12-03 12:39:28.286157954 +0000 UTC m=+1557.131119005" observedRunningTime="2025-12-03 12:39:28.736213404 +0000 UTC m=+1557.581174475" watchObservedRunningTime="2025-12-03 12:39:28.741057325 +0000 UTC m=+1557.586018386" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.745718 4666 scope.go:117] "RemoveContainer" containerID="534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.757857 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7x7ll" podStartSLOduration=1.311152541 podStartE2EDuration="4.757834878s" podCreationTimestamp="2025-12-03 12:39:24 +0000 UTC" firstStartedPulling="2025-12-03 12:39:24.860220537 +0000 UTC m=+1553.705181608" lastFinishedPulling="2025-12-03 12:39:28.306902894 +0000 UTC m=+1557.151863945" observedRunningTime="2025-12-03 12:39:28.750568461 +0000 UTC m=+1557.595529532" watchObservedRunningTime="2025-12-03 12:39:28.757834878 +0000 UTC m=+1557.602795929" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.768329 4666 scope.go:117] "RemoveContainer" containerID="48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.774550 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-m4v87" podStartSLOduration=1.530287797 
podStartE2EDuration="4.774520968s" podCreationTimestamp="2025-12-03 12:39:24 +0000 UTC" firstStartedPulling="2025-12-03 12:39:25.031295346 +0000 UTC m=+1553.876256397" lastFinishedPulling="2025-12-03 12:39:28.275528517 +0000 UTC m=+1557.120489568" observedRunningTime="2025-12-03 12:39:28.772155164 +0000 UTC m=+1557.617116215" watchObservedRunningTime="2025-12-03 12:39:28.774520968 +0000 UTC m=+1557.619482019" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.790479 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba87c49d-1fc8-482e-89d7-61493fb04a13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.792010 4666 scope.go:117] "RemoveContainer" containerID="7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686" Dec 03 12:39:28 crc kubenswrapper[4666]: E1203 12:39:28.792609 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686\": container with ID starting with 7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686 not found: ID does not exist" containerID="7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.792646 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686"} err="failed to get container status \"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686\": rpc error: code = NotFound desc = could not find container \"7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686\": container with ID starting with 7a9a7a319f658cacba9ab34fd8acbac3b7053e8627e998da97515990da915686 not found: ID does not exist" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.792668 4666 scope.go:117] "RemoveContainer" containerID="534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6" Dec 03 12:39:28 crc kubenswrapper[4666]: E1203 12:39:28.793361 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6\": container with ID starting with 534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6 not found: ID does not exist" containerID="534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.793384 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6"} err="failed to get container status \"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6\": rpc error: code = NotFound desc = could not find container \"534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6\": container with ID starting with 534cf6a0facaf4546cf8582d9ed1b5b95af8d34c5b77bdf93d797d03ae5227e6 not found: ID does not exist" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.793401 4666 scope.go:117] "RemoveContainer" containerID="48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db" Dec 03 12:39:28 crc kubenswrapper[4666]: E1203 12:39:28.793687 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db\": container with ID starting with 48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db not found: ID does not exist" containerID="48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db" Dec 03 12:39:28 crc kubenswrapper[4666]: I1203 12:39:28.793717 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db"} err="failed to get container status \"48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db\": rpc error: code = NotFound desc = could not find container \"48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db\": container with ID starting with 48563195253b1ed3b93559b6c6afee0e86f6d4d8a6c755ff9bce7d8c2684b0db not found: ID does not exist" Dec 03 12:39:29 crc kubenswrapper[4666]: I1203 12:39:29.057111 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:29 crc kubenswrapper[4666]: I1203 12:39:29.063357 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sh9p4"] Dec 03 12:39:29 crc kubenswrapper[4666]: I1203 12:39:29.431769 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" path="/var/lib/kubelet/pods/ba87c49d-1fc8-482e-89d7-61493fb04a13/volumes" Dec 03 12:39:31 crc kubenswrapper[4666]: I1203 12:39:31.761077 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" event={"ID":"65d7f250-10bf-4a17-879f-856d2ea16b91","Type":"ContainerStarted","Data":"48488c6f729ca74e066b10861c3ebefca887c0eb6a2ce4a4809574923ad09076"} Dec 03 12:39:31 crc kubenswrapper[4666]: I1203 12:39:31.784822 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5gp27" podStartSLOduration=1.8963879700000001 podStartE2EDuration="7.784800654s" podCreationTimestamp="2025-12-03 12:39:24 +0000 UTC" firstStartedPulling="2025-12-03 12:39:24.789769415 +0000 UTC m=+1553.634730486" lastFinishedPulling="2025-12-03 12:39:30.678182119 +0000 UTC m=+1559.523143170" observedRunningTime="2025-12-03 12:39:31.782492231 +0000 UTC m=+1560.627453282" watchObservedRunningTime="2025-12-03 12:39:31.784800654 +0000 UTC m=+1560.629761705" Dec 03 12:39:34 crc kubenswrapper[4666]: I1203 12:39:34.859212 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7x7ll" Dec 03 12:39:34 crc kubenswrapper[4666]: I1203 12:39:34.932360 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:34 crc kubenswrapper[4666]: I1203 12:39:34.932613 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:34 crc kubenswrapper[4666]: I1203 12:39:34.939778 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:35 crc kubenswrapper[4666]: I1203 12:39:35.791692 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-96d9dfbb5-rpg8d" Dec 03 12:39:35 crc kubenswrapper[4666]: I1203 12:39:35.857828 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"] Dec 03 12:39:44 crc 
kubenswrapper[4666]: I1203 12:39:44.818061 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vvj59" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.301819 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg"] Dec 03 12:40:00 crc kubenswrapper[4666]: E1203 12:40:00.302940 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="extract-utilities" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.302959 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="extract-utilities" Dec 03 12:40:00 crc kubenswrapper[4666]: E1203 12:40:00.302970 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="extract-content" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.302978 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="extract-content" Dec 03 12:40:00 crc kubenswrapper[4666]: E1203 12:40:00.302990 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="registry-server" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.302997 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="registry-server" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.303138 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba87c49d-1fc8-482e-89d7-61493fb04a13" containerName="registry-server" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.304247 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.307061 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.312667 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg"] Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.402309 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.402377 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbblf\" (UniqueName: \"kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.402699 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.504446 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.504547 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbblf\" (UniqueName: \"kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.504657 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.505155 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.505309 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.524945 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbblf\" (UniqueName: \"kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.626297 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:00 crc kubenswrapper[4666]: I1203 12:40:00.910067 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rxcq5" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" containerID="cri-o://2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d" gracePeriod=15 Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.067847 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg"] Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.788925 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxcq5_b0b79044-b1ee-45fe-8b35-e9fc44f47e46/console/0.log" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.789743 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.827633 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.827719 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.827896 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.828237 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829032 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config" (OuterVolumeSpecName: "console-config") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829141 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829168 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca" (OuterVolumeSpecName: "service-ca") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829204 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829347 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829421 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829449 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4r66\" (UniqueName: \"kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66\") pod \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\" (UID: \"b0b79044-b1ee-45fe-8b35-e9fc44f47e46\") " Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829748 4666 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829776 4666 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829788 4666 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.829799 4666 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.836797 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66" (OuterVolumeSpecName: "kube-api-access-q4r66") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "kube-api-access-q4r66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.839624 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.840273 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b0b79044-b1ee-45fe-8b35-e9fc44f47e46" (UID: "b0b79044-b1ee-45fe-8b35-e9fc44f47e46"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.931564 4666 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.931646 4666 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.931678 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4r66\" (UniqueName: \"kubernetes.io/projected/b0b79044-b1ee-45fe-8b35-e9fc44f47e46-kube-api-access-q4r66\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987143 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxcq5_b0b79044-b1ee-45fe-8b35-e9fc44f47e46/console/0.log" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987568 4666 generic.go:334] "Generic (PLEG): container finished" podID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerID="2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d" exitCode=2 Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987694 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxcq5" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987626 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxcq5" event={"ID":"b0b79044-b1ee-45fe-8b35-e9fc44f47e46","Type":"ContainerDied","Data":"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d"} Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987837 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxcq5" event={"ID":"b0b79044-b1ee-45fe-8b35-e9fc44f47e46","Type":"ContainerDied","Data":"e93d74602875117c1d4cf974caa522c62ea74e6070ffd04bbb5025ad8970e2c9"} Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.987863 4666 scope.go:117] "RemoveContainer" containerID="2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d" Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.989729 4666 generic.go:334] "Generic (PLEG): container finished" podID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerID="44e9f002952c1b795efa65ea15efa11f9d730892c4b44a7daa2e1742fbfa706e" exitCode=0 Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.989795 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" event={"ID":"8d46dfdf-d48c-494f-9535-b3d6c05f3b45","Type":"ContainerDied","Data":"44e9f002952c1b795efa65ea15efa11f9d730892c4b44a7daa2e1742fbfa706e"} Dec 03 12:40:01 crc kubenswrapper[4666]: I1203 12:40:01.990477 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" event={"ID":"8d46dfdf-d48c-494f-9535-b3d6c05f3b45","Type":"ContainerStarted","Data":"21ffe7dcbd740dabe3b7dd9e3f855eba703a34bbaaa6af2827fda2fb625a0be6"} Dec 03 12:40:02 crc kubenswrapper[4666]: I1203 12:40:02.028450 4666 scope.go:117] "RemoveContainer" containerID="2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d" Dec 03 12:40:02 crc 
kubenswrapper[4666]: E1203 12:40:02.030600 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d\": container with ID starting with 2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d not found: ID does not exist" containerID="2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d" Dec 03 12:40:02 crc kubenswrapper[4666]: I1203 12:40:02.030679 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d"} err="failed to get container status \"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d\": rpc error: code = NotFound desc = could not find container \"2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d\": container with ID starting with 2c4f1f19bf1d2c06b3c824d2fdfb29fea997d6b063fd57c4dbfd2d86ab4a7f4d not found: ID does not exist" Dec 03 12:40:02 crc kubenswrapper[4666]: I1203 12:40:02.047311 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"] Dec 03 12:40:02 crc kubenswrapper[4666]: I1203 12:40:02.053389 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rxcq5"] Dec 03 12:40:03 crc kubenswrapper[4666]: I1203 12:40:03.433536 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" path="/var/lib/kubelet/pods/b0b79044-b1ee-45fe-8b35-e9fc44f47e46/volumes" Dec 03 12:40:04 crc kubenswrapper[4666]: I1203 12:40:04.011995 4666 generic.go:334] "Generic (PLEG): container finished" podID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerID="d645859ca6b95c3548cf0bd72266867faf9cf2ebafcd39b7b7641cf59112b076" exitCode=0 Dec 03 12:40:04 crc kubenswrapper[4666]: I1203 12:40:04.012065 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" event={"ID":"8d46dfdf-d48c-494f-9535-b3d6c05f3b45","Type":"ContainerDied","Data":"d645859ca6b95c3548cf0bd72266867faf9cf2ebafcd39b7b7641cf59112b076"} Dec 03 12:40:05 crc kubenswrapper[4666]: I1203 12:40:05.024700 4666 generic.go:334] "Generic (PLEG): container finished" podID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerID="586b59af4d38c57a03f1c94dc6c2cacb7ab481dc64c0a42dae083e918e5af97a" exitCode=0 Dec 03 12:40:05 crc kubenswrapper[4666]: I1203 12:40:05.024751 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" event={"ID":"8d46dfdf-d48c-494f-9535-b3d6c05f3b45","Type":"ContainerDied","Data":"586b59af4d38c57a03f1c94dc6c2cacb7ab481dc64c0a42dae083e918e5af97a"} Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.418815 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.510812 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle\") pod \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.510947 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbblf\" (UniqueName: \"kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf\") pod \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.511009 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util\") pod \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\" (UID: \"8d46dfdf-d48c-494f-9535-b3d6c05f3b45\") " Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.512645 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle" (OuterVolumeSpecName: "bundle") pod "8d46dfdf-d48c-494f-9535-b3d6c05f3b45" (UID: "8d46dfdf-d48c-494f-9535-b3d6c05f3b45"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.520375 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf" (OuterVolumeSpecName: "kube-api-access-tbblf") pod "8d46dfdf-d48c-494f-9535-b3d6c05f3b45" (UID: "8d46dfdf-d48c-494f-9535-b3d6c05f3b45"). InnerVolumeSpecName "kube-api-access-tbblf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.569688 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util" (OuterVolumeSpecName: "util") pod "8d46dfdf-d48c-494f-9535-b3d6c05f3b45" (UID: "8d46dfdf-d48c-494f-9535-b3d6c05f3b45"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.612415 4666 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.612472 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbblf\" (UniqueName: \"kubernetes.io/projected/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-kube-api-access-tbblf\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:06 crc kubenswrapper[4666]: I1203 12:40:06.612488 4666 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d46dfdf-d48c-494f-9535-b3d6c05f3b45-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:07 crc kubenswrapper[4666]: I1203 12:40:07.050326 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" event={"ID":"8d46dfdf-d48c-494f-9535-b3d6c05f3b45","Type":"ContainerDied","Data":"21ffe7dcbd740dabe3b7dd9e3f855eba703a34bbaaa6af2827fda2fb625a0be6"} Dec 03 12:40:07 crc kubenswrapper[4666]: I1203 12:40:07.051369 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ffe7dcbd740dabe3b7dd9e3f855eba703a34bbaaa6af2827fda2fb625a0be6" Dec 03 12:40:07 crc kubenswrapper[4666]: I1203 12:40:07.051592 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.821891 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb"] Dec 03 12:40:15 crc kubenswrapper[4666]: E1203 12:40:15.822959 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="extract" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.822978 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="extract" Dec 03 12:40:15 crc kubenswrapper[4666]: E1203 12:40:15.822998 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823008 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" Dec 03 12:40:15 crc kubenswrapper[4666]: E1203 12:40:15.823021 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="pull" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823034 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="pull" Dec 03 12:40:15 crc kubenswrapper[4666]: E1203 12:40:15.823055 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="util" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823062 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="util" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823199 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b79044-b1ee-45fe-8b35-e9fc44f47e46" containerName="console" Dec 03 
12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823218 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d46dfdf-d48c-494f-9535-b3d6c05f3b45" containerName="extract" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.823711 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.826417 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.826861 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.827066 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.827208 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.827334 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fp9sg" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.839464 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb"] Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.936896 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-apiservice-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.936993 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnxb\" (UniqueName: \"kubernetes.io/projected/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-kube-api-access-fdnxb\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:15 crc kubenswrapper[4666]: I1203 12:40:15.937074 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-webhook-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.038727 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-apiservice-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.038813 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnxb\" (UniqueName: 
\"kubernetes.io/projected/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-kube-api-access-fdnxb\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.038839 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-webhook-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.045468 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-webhook-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.049693 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-apiservice-cert\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.054006 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnxb\" (UniqueName: \"kubernetes.io/projected/c594fca4-0d6a-47e1-acc4-b9434ce17bb9-kube-api-access-fdnxb\") pod \"metallb-operator-controller-manager-b8d9b7676-d2hwb\" (UID: \"c594fca4-0d6a-47e1-acc4-b9434ce17bb9\") " pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.146966 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.195401 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-868f45c797-svsbn"] Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.196600 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.200658 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.200716 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8dvvl" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.201073 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.230872 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-868f45c797-svsbn"] Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.243220 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-apiservice-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.243294 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlcf\" (UniqueName: \"kubernetes.io/projected/9606ffd7-351c-4485-b17a-779a724a1859-kube-api-access-dzlcf\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.243338 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-webhook-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.346469 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-webhook-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.347235 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-apiservice-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.347297 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlcf\" (UniqueName: \"kubernetes.io/projected/9606ffd7-351c-4485-b17a-779a724a1859-kube-api-access-dzlcf\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 
12:40:16.351959 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-apiservice-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.354377 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9606ffd7-351c-4485-b17a-779a724a1859-webhook-cert\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.369151 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlcf\" (UniqueName: \"kubernetes.io/projected/9606ffd7-351c-4485-b17a-779a724a1859-kube-api-access-dzlcf\") pod \"metallb-operator-webhook-server-868f45c797-svsbn\" (UID: \"9606ffd7-351c-4485-b17a-779a724a1859\") " pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.478797 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb"] Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.534438 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:16 crc kubenswrapper[4666]: I1203 12:40:16.805756 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-868f45c797-svsbn"] Dec 03 12:40:16 crc kubenswrapper[4666]: W1203 12:40:16.815018 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9606ffd7_351c_4485_b17a_779a724a1859.slice/crio-38979caca52a2957c45a8ebaaf315afcc91c2d78eb63a659633d9355341ab72d WatchSource:0}: Error finding container 38979caca52a2957c45a8ebaaf315afcc91c2d78eb63a659633d9355341ab72d: Status 404 returned error can't find the container with id 38979caca52a2957c45a8ebaaf315afcc91c2d78eb63a659633d9355341ab72d Dec 03 12:40:17 crc kubenswrapper[4666]: I1203 12:40:17.119017 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" event={"ID":"9606ffd7-351c-4485-b17a-779a724a1859","Type":"ContainerStarted","Data":"38979caca52a2957c45a8ebaaf315afcc91c2d78eb63a659633d9355341ab72d"} Dec 03 12:40:17 crc kubenswrapper[4666]: I1203 12:40:17.122121 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" event={"ID":"c594fca4-0d6a-47e1-acc4-b9434ce17bb9","Type":"ContainerStarted","Data":"0302b6dc371d3f42a39d629555e951ca14b1070f2ac00a43bc71bab4e6c97978"} Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.173575 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" event={"ID":"c594fca4-0d6a-47e1-acc4-b9434ce17bb9","Type":"ContainerStarted","Data":"ee2fe544b12a66c096da716e1883b4d1ac81aee0976b1a7a728121ec9d0c4f29"} Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.174516 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.175527 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" event={"ID":"9606ffd7-351c-4485-b17a-779a724a1859","Type":"ContainerStarted","Data":"7e6225947efb89fd57774cee34bf2c7397e7ba885fbc89155c6030c309e497a6"} Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.175926 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.204599 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" podStartSLOduration=2.615361568 podStartE2EDuration="8.204573555s" podCreationTimestamp="2025-12-03 12:40:15 +0000 UTC" firstStartedPulling="2025-12-03 12:40:16.490562713 +0000 UTC m=+1605.335523764" lastFinishedPulling="2025-12-03 12:40:22.0797747 +0000 UTC m=+1610.924735751" observedRunningTime="2025-12-03 12:40:23.195469559 +0000 UTC m=+1612.040430640" watchObservedRunningTime="2025-12-03 12:40:23.204573555 +0000 UTC m=+1612.049534636" Dec 03 12:40:23 crc kubenswrapper[4666]: I1203 12:40:23.219245 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" podStartSLOduration=1.935570451 podStartE2EDuration="7.21921735s" podCreationTimestamp="2025-12-03 12:40:16 +0000 UTC" firstStartedPulling="2025-12-03 12:40:16.818899186 +0000 UTC m=+1605.663860237" lastFinishedPulling="2025-12-03 12:40:22.102546095 +0000 UTC m=+1610.947507136" observedRunningTime="2025-12-03 12:40:23.214289357 +0000 UTC m=+1612.059250438" watchObservedRunningTime="2025-12-03 12:40:23.21921735 +0000 UTC m=+1612.064178431" Dec 03 12:40:36 crc kubenswrapper[4666]: I1203 12:40:36.541040 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-868f45c797-svsbn" Dec 03 12:40:39 crc kubenswrapper[4666]: I1203 12:40:39.865809 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:40:39 crc kubenswrapper[4666]: I1203 12:40:39.866320 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:40:56 crc kubenswrapper[4666]: I1203 12:40:56.151377 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-b8d9b7676-d2hwb" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.010099 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xpxmn"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.012392 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.013735 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.014482 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.015597 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-v2lw9" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.016412 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.017059 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.019513 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.051248 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.080426 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-27md4"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.081426 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.083791 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.084072 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.084427 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wcrns" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.084589 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.115878 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-4zn7s"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.117346 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.121177 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.128723 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-4zn7s"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.161204 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.161257 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics-certs\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.161298 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvcq\" (UniqueName: \"kubernetes.io/projected/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-kube-api-access-zbvcq\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.161325 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-conf\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.162812 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnsn\" (UniqueName: \"kubernetes.io/projected/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-kube-api-access-8qnsn\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.162880 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-reloader\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.162913 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.163311 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-sockets\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc 
kubenswrapper[4666]: I1203 12:40:57.163419 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-startup\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264677 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics-certs\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264734 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264778 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9sg5\" (UniqueName: \"kubernetes.io/projected/782afea8-e67f-4724-992b-6d318c9f9e5c-kube-api-access-c9sg5\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264812 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvcq\" (UniqueName: \"kubernetes.io/projected/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-kube-api-access-zbvcq\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264833 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c1d0b522-d828-4f16-9d3b-64b16697898a-metallb-excludel2\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.264860 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-conf\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265048 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnsn\" (UniqueName: \"kubernetes.io/projected/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-kube-api-access-8qnsn\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265148 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-reloader\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265190 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265336 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265492 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-sockets\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265546 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-startup\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265577 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqzl\" (UniqueName: \"kubernetes.io/projected/c1d0b522-d828-4f16-9d3b-64b16697898a-kube-api-access-9jqzl\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265650 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265726 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-metrics-certs\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265758 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-cert\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.265812 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.266365 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-startup\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " 
pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.266780 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-conf\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.266953 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-frr-sockets\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.267013 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-reloader\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.280902 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.287121 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-metrics-certs\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.287650 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnsn\" (UniqueName: \"kubernetes.io/projected/b1b3e6c6-2466-465d-8b6a-13c75e60ed62-kube-api-access-8qnsn\") pod \"frr-k8s-xpxmn\" (UID: \"b1b3e6c6-2466-465d-8b6a-13c75e60ed62\") " pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.294765 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvcq\" (UniqueName: \"kubernetes.io/projected/eaa2e763-b8bc-4f22-9bbf-43d36d8c2088-kube-api-access-zbvcq\") pod \"frr-k8s-webhook-server-7fcb986d4-xjlzk\" (UID: \"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.333170 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.346540 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.367835 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9sg5\" (UniqueName: \"kubernetes.io/projected/782afea8-e67f-4724-992b-6d318c9f9e5c-kube-api-access-c9sg5\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.367880 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c1d0b522-d828-4f16-9d3b-64b16697898a-metallb-excludel2\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.367930 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.367966 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqzl\" (UniqueName: \"kubernetes.io/projected/c1d0b522-d828-4f16-9d3b-64b16697898a-kube-api-access-9jqzl\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.367988 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.368014 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-metrics-certs\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.368032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-cert\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.368164 4666 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.368167 4666 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.368233 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist podName:c1d0b522-d828-4f16-9d3b-64b16697898a nodeName:}" failed. No retries permitted until 2025-12-03 12:40:57.868214113 +0000 UTC m=+1646.713175164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist") pod "speaker-27md4" (UID: "c1d0b522-d828-4f16-9d3b-64b16697898a") : secret "metallb-memberlist" not found Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.368300 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs podName:782afea8-e67f-4724-992b-6d318c9f9e5c nodeName:}" failed. No retries permitted until 2025-12-03 12:40:57.868267825 +0000 UTC m=+1646.713228876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs") pod "controller-f8648f98b-4zn7s" (UID: "782afea8-e67f-4724-992b-6d318c9f9e5c") : secret "controller-certs-secret" not found Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.368999 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c1d0b522-d828-4f16-9d3b-64b16697898a-metallb-excludel2\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.371820 4666 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.372237 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-metrics-certs\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.386665 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9sg5\" (UniqueName: \"kubernetes.io/projected/782afea8-e67f-4724-992b-6d318c9f9e5c-kube-api-access-c9sg5\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.391386 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-cert\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.392130 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqzl\" (UniqueName: \"kubernetes.io/projected/c1d0b522-d828-4f16-9d3b-64b16697898a-kube-api-access-9jqzl\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.669104 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk"] Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.878375 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.878847 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.879113 4666 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 12:40:57 crc kubenswrapper[4666]: E1203 12:40:57.879252 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist podName:c1d0b522-d828-4f16-9d3b-64b16697898a nodeName:}" failed. No retries permitted until 2025-12-03 12:40:58.879220099 +0000 UTC m=+1647.724181150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist") pod "speaker-27md4" (UID: "c1d0b522-d828-4f16-9d3b-64b16697898a") : secret "metallb-memberlist" not found Dec 03 12:40:57 crc kubenswrapper[4666]: I1203 12:40:57.885074 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/782afea8-e67f-4724-992b-6d318c9f9e5c-metrics-certs\") pod \"controller-f8648f98b-4zn7s\" (UID: \"782afea8-e67f-4724-992b-6d318c9f9e5c\") " pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.031594 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.229343 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-4zn7s"] Dec 03 12:40:58 crc kubenswrapper[4666]: W1203 12:40:58.238281 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782afea8_e67f_4724_992b_6d318c9f9e5c.slice/crio-bf92820e1f4466d4732cb34f93fe1fb19e6ae8460b6f38921cf2833fc68877cd WatchSource:0}: Error finding container bf92820e1f4466d4732cb34f93fe1fb19e6ae8460b6f38921cf2833fc68877cd: Status 404 returned error can't find the container with id bf92820e1f4466d4732cb34f93fe1fb19e6ae8460b6f38921cf2833fc68877cd Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.440694 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"2bab59d805cc8df55f0163aed71d4d20ba7d99d9e2cf3379519173c2e95c4af9"} Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.443069 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4zn7s" event={"ID":"782afea8-e67f-4724-992b-6d318c9f9e5c","Type":"ContainerStarted","Data":"bf92820e1f4466d4732cb34f93fe1fb19e6ae8460b6f38921cf2833fc68877cd"} Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.444426 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" event={"ID":"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088","Type":"ContainerStarted","Data":"7789d0e23431981180c1e0de8e22927307cdc441e10b14d70c9ffb3a2666e5c4"} Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.898100 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist\") pod 
\"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:58 crc kubenswrapper[4666]: I1203 12:40:58.903635 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c1d0b522-d828-4f16-9d3b-64b16697898a-memberlist\") pod \"speaker-27md4\" (UID: \"c1d0b522-d828-4f16-9d3b-64b16697898a\") " pod="metallb-system/speaker-27md4" Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.196077 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-27md4" Dec 03 12:40:59 crc kubenswrapper[4666]: W1203 12:40:59.221570 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1d0b522_d828_4f16_9d3b_64b16697898a.slice/crio-b9ce120d68a9309d73e7ad3578f8ba5a5eb8f1f5742c98f73b74ce7f09002365 WatchSource:0}: Error finding container b9ce120d68a9309d73e7ad3578f8ba5a5eb8f1f5742c98f73b74ce7f09002365: Status 404 returned error can't find the container with id b9ce120d68a9309d73e7ad3578f8ba5a5eb8f1f5742c98f73b74ce7f09002365 Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.455406 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4zn7s" event={"ID":"782afea8-e67f-4724-992b-6d318c9f9e5c","Type":"ContainerStarted","Data":"e375a0a3bde739b06cf3c4613978468a0a9b909d10e8509feee815fcfaaee955"} Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.455455 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4zn7s" event={"ID":"782afea8-e67f-4724-992b-6d318c9f9e5c","Type":"ContainerStarted","Data":"fd8e1f05832cef0a57f606d274e36a90f676018e8d932a5b94099a97b0b25e76"} Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.455772 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.458800 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27md4" event={"ID":"c1d0b522-d828-4f16-9d3b-64b16697898a","Type":"ContainerStarted","Data":"b9ce120d68a9309d73e7ad3578f8ba5a5eb8f1f5742c98f73b74ce7f09002365"} Dec 03 12:40:59 crc kubenswrapper[4666]: I1203 12:40:59.491748 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-4zn7s" podStartSLOduration=2.491726189 podStartE2EDuration="2.491726189s" podCreationTimestamp="2025-12-03 12:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:40:59.486753615 +0000 UTC m=+1648.331714666" watchObservedRunningTime="2025-12-03 12:40:59.491726189 +0000 UTC m=+1648.336687230" Dec 03 12:41:00 crc kubenswrapper[4666]: I1203 12:41:00.468101 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27md4" event={"ID":"c1d0b522-d828-4f16-9d3b-64b16697898a","Type":"ContainerStarted","Data":"12f1096f467cbd428b2e4ce7281fe651b468597bfe0053e6c19aa1d4f64a7955"} Dec 03 12:41:00 crc kubenswrapper[4666]: I1203 12:41:00.468477 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27md4" event={"ID":"c1d0b522-d828-4f16-9d3b-64b16697898a","Type":"ContainerStarted","Data":"d302f8c1a86f4e334cd92daa3c3a02661933e981d4a6d8e5aaf46aa027888ed8"} Dec 03 12:41:00 crc kubenswrapper[4666]: I1203 12:41:00.496693 4666 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-27md4" podStartSLOduration=3.496668289 podStartE2EDuration="3.496668289s" podCreationTimestamp="2025-12-03 12:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:41:00.496263508 +0000 UTC m=+1649.341224559" watchObservedRunningTime="2025-12-03 12:41:00.496668289 +0000 UTC m=+1649.341629340" Dec 03 12:41:01 crc kubenswrapper[4666]: I1203 12:41:01.474898 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-27md4" Dec 03 12:41:06 crc kubenswrapper[4666]: I1203 12:41:06.518936 4666 generic.go:334] "Generic (PLEG): container finished" podID="b1b3e6c6-2466-465d-8b6a-13c75e60ed62" containerID="c6881e22e9c8188425940127dc94b0d3fc092f6c076f8be90c39a55db80aad29" exitCode=0 Dec 03 12:41:06 crc kubenswrapper[4666]: I1203 12:41:06.519070 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerDied","Data":"c6881e22e9c8188425940127dc94b0d3fc092f6c076f8be90c39a55db80aad29"} Dec 03 12:41:06 crc kubenswrapper[4666]: I1203 12:41:06.521742 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" event={"ID":"eaa2e763-b8bc-4f22-9bbf-43d36d8c2088","Type":"ContainerStarted","Data":"1dd151b3d6ca425765b07433473406ba6dea326d5878591e38c728751803489c"} Dec 03 12:41:06 crc kubenswrapper[4666]: I1203 12:41:06.521928 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:41:06 crc kubenswrapper[4666]: I1203 12:41:06.577617 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" podStartSLOduration=2.388216449 podStartE2EDuration="10.57758446s" podCreationTimestamp="2025-12-03 12:40:56 +0000 UTC" firstStartedPulling="2025-12-03 12:40:57.681623584 +0000 UTC m=+1646.526584635" lastFinishedPulling="2025-12-03 12:41:05.870991595 +0000 UTC m=+1654.715952646" observedRunningTime="2025-12-03 12:41:06.571535877 +0000 UTC m=+1655.416496928" watchObservedRunningTime="2025-12-03 12:41:06.57758446 +0000 UTC m=+1655.422545511" Dec 03 12:41:07 crc kubenswrapper[4666]: I1203 12:41:07.531381 4666 generic.go:334] "Generic (PLEG): container finished" podID="b1b3e6c6-2466-465d-8b6a-13c75e60ed62" containerID="b1ae1bac363185067c422242d721c458a2831e4cb8d9aa2d973e6fa839690534" exitCode=0 Dec 03 12:41:07 crc kubenswrapper[4666]: I1203 12:41:07.531440 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerDied","Data":"b1ae1bac363185067c422242d721c458a2831e4cb8d9aa2d973e6fa839690534"} Dec 03 12:41:08 crc kubenswrapper[4666]: I1203 12:41:08.037299 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-4zn7s" Dec 03 12:41:08 crc kubenswrapper[4666]: I1203 12:41:08.542066 4666 generic.go:334] "Generic (PLEG): container finished" podID="b1b3e6c6-2466-465d-8b6a-13c75e60ed62" containerID="57e9fb1269f8555a41fbed6b1c48e26c9b3d8776a9a4f470732c46f4ed4aef81" exitCode=0 Dec 03 12:41:08 crc kubenswrapper[4666]: I1203 12:41:08.542167 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" 
event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerDied","Data":"57e9fb1269f8555a41fbed6b1c48e26c9b3d8776a9a4f470732c46f4ed4aef81"} Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.201518 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-27md4" Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.559774 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"811ac7136f11b4d3a5bf318d59780776341056113fda69b37b20afc4cf00a420"} Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.559846 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"83b1b7202edfdc5ca66cdb5db5ed7c63144c41d2be31cd2817ccb9eb8b3eef29"} Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.559862 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"596e7351ae0d87d4abc27e50eb595facca61dbc99e6e669f604896d3daa2194d"} Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.865908 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:41:09 crc kubenswrapper[4666]: I1203 12:41:09.865963 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:41:10 crc kubenswrapper[4666]: I1203 12:41:10.573226 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"d6c735aaaaf826338a0c274099f8f80fd3baf16c486540d196e9477121fd671a"} Dec 03 12:41:10 crc kubenswrapper[4666]: I1203 12:41:10.573549 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:41:10 crc kubenswrapper[4666]: I1203 12:41:10.573560 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"c4dbfffb8ced03a4bd53a886f5bab62b427267b9b70f03b8e273c15e8c388b8e"} Dec 03 12:41:10 crc kubenswrapper[4666]: I1203 12:41:10.573569 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxmn" event={"ID":"b1b3e6c6-2466-465d-8b6a-13c75e60ed62","Type":"ContainerStarted","Data":"69a11a99ecb15b833e6b158fbac3d8b51a0cbcf7a7029c848303263cb0421e7f"} Dec 03 12:41:10 crc kubenswrapper[4666]: I1203 12:41:10.603196 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xpxmn" podStartSLOduration=6.349572382 podStartE2EDuration="14.603162246s" podCreationTimestamp="2025-12-03 12:40:56 +0000 UTC" firstStartedPulling="2025-12-03 12:40:57.600460523 +0000 UTC m=+1646.445421574" lastFinishedPulling="2025-12-03 12:41:05.854050387 +0000 UTC m=+1654.699011438" observedRunningTime="2025-12-03 
12:41:10.59778013 +0000 UTC m=+1659.442741211" watchObservedRunningTime="2025-12-03 12:41:10.603162246 +0000 UTC m=+1659.448123297" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.096270 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.097684 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.099674 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hw25\" (UniqueName: \"kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25\") pod \"openstack-operator-index-swgth\" (UID: \"9dfa4f8e-85f8-4192-87e9-4101e0469040\") " pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.101987 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.104268 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6bxbn" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.105035 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.113287 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.200514 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hw25\" (UniqueName: \"kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25\") pod \"openstack-operator-index-swgth\" (UID: \"9dfa4f8e-85f8-4192-87e9-4101e0469040\") " pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.221912 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hw25\" (UniqueName: \"kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25\") pod \"openstack-operator-index-swgth\" (UID: \"9dfa4f8e-85f8-4192-87e9-4101e0469040\") " pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.335711 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.398706 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.419767 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:12 crc kubenswrapper[4666]: I1203 12:41:12.691348 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:13 crc kubenswrapper[4666]: I1203 12:41:13.596624 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swgth" event={"ID":"9dfa4f8e-85f8-4192-87e9-4101e0469040","Type":"ContainerStarted","Data":"0a43dee098d08b5bfc05166dcb3ea80a1178f2248800a4ec5bfd3e2dd0193f62"} Dec 03 12:41:15 crc kubenswrapper[4666]: I1203 12:41:15.856541 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.465933 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z7hzf"] Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.467392 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.468637 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79r2l\" (UniqueName: \"kubernetes.io/projected/04142222-9a39-4c8f-81b1-df4035625463-kube-api-access-79r2l\") pod \"openstack-operator-index-z7hzf\" (UID: \"04142222-9a39-4c8f-81b1-df4035625463\") " pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.476880 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z7hzf"] Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.570139 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79r2l\" (UniqueName: \"kubernetes.io/projected/04142222-9a39-4c8f-81b1-df4035625463-kube-api-access-79r2l\") pod \"openstack-operator-index-z7hzf\" (UID: \"04142222-9a39-4c8f-81b1-df4035625463\") " pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.591299 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79r2l\" (UniqueName: \"kubernetes.io/projected/04142222-9a39-4c8f-81b1-df4035625463-kube-api-access-79r2l\") pod \"openstack-operator-index-z7hzf\" (UID: \"04142222-9a39-4c8f-81b1-df4035625463\") " pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:16 crc kubenswrapper[4666]: I1203 12:41:16.792293 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:17 crc kubenswrapper[4666]: I1203 12:41:17.351575 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xjlzk" Dec 03 12:41:17 crc kubenswrapper[4666]: I1203 12:41:17.360581 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z7hzf"] Dec 03 12:41:18 crc kubenswrapper[4666]: W1203 12:41:18.597754 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04142222_9a39_4c8f_81b1_df4035625463.slice/crio-b463f03e1329ba9b8b5a346f7db664efe312090d49433fe1271a741620a373fe WatchSource:0}: Error finding container b463f03e1329ba9b8b5a346f7db664efe312090d49433fe1271a741620a373fe: Status 404 returned error can't find the container with id b463f03e1329ba9b8b5a346f7db664efe312090d49433fe1271a741620a373fe Dec 03 12:41:18 crc kubenswrapper[4666]: I1203 12:41:18.628611 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z7hzf" event={"ID":"04142222-9a39-4c8f-81b1-df4035625463","Type":"ContainerStarted","Data":"b463f03e1329ba9b8b5a346f7db664efe312090d49433fe1271a741620a373fe"} Dec 03 12:41:19 crc kubenswrapper[4666]: I1203 12:41:19.639912 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z7hzf" event={"ID":"04142222-9a39-4c8f-81b1-df4035625463","Type":"ContainerStarted","Data":"9e63a68c3643184279cf703a1e75a5726eeba5cfce390a6f96da98243fa01493"} Dec 03 12:41:19 crc kubenswrapper[4666]: I1203 12:41:19.643030 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swgth" event={"ID":"9dfa4f8e-85f8-4192-87e9-4101e0469040","Type":"ContainerStarted","Data":"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de"} Dec 03 12:41:19 crc kubenswrapper[4666]: I1203 12:41:19.643362 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-swgth" podUID="9dfa4f8e-85f8-4192-87e9-4101e0469040" containerName="registry-server" containerID="cri-o://52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de" gracePeriod=2 Dec 03 12:41:19 crc kubenswrapper[4666]: I1203 12:41:19.665862 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z7hzf" podStartSLOduration=3.596103051 podStartE2EDuration="3.665826743s" podCreationTimestamp="2025-12-03 12:41:16 +0000 UTC" firstStartedPulling="2025-12-03 12:41:18.602096616 +0000 UTC m=+1667.447057667" lastFinishedPulling="2025-12-03 12:41:18.671820308 +0000 UTC m=+1667.516781359" observedRunningTime="2025-12-03 12:41:19.658933897 +0000 UTC m=+1668.503894998" watchObservedRunningTime="2025-12-03 12:41:19.665826743 +0000 UTC m=+1668.510787824" Dec 03 12:41:19 crc kubenswrapper[4666]: I1203 12:41:19.683873 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-swgth" podStartSLOduration=1.7890496329999999 podStartE2EDuration="7.683840949s" podCreationTimestamp="2025-12-03 12:41:12 +0000 UTC" firstStartedPulling="2025-12-03 12:41:12.718378029 +0000 UTC m=+1661.563339080" lastFinishedPulling="2025-12-03 12:41:18.613169345 +0000 UTC m=+1667.458130396" observedRunningTime="2025-12-03 12:41:19.682551165 +0000 UTC m=+1668.527512216" 
watchObservedRunningTime="2025-12-03 12:41:19.683840949 +0000 UTC m=+1668.528802020" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.142028 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.228135 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hw25\" (UniqueName: \"kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25\") pod \"9dfa4f8e-85f8-4192-87e9-4101e0469040\" (UID: \"9dfa4f8e-85f8-4192-87e9-4101e0469040\") " Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.237589 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25" (OuterVolumeSpecName: "kube-api-access-2hw25") pod "9dfa4f8e-85f8-4192-87e9-4101e0469040" (UID: "9dfa4f8e-85f8-4192-87e9-4101e0469040"). InnerVolumeSpecName "kube-api-access-2hw25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.329515 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hw25\" (UniqueName: \"kubernetes.io/projected/9dfa4f8e-85f8-4192-87e9-4101e0469040-kube-api-access-2hw25\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.652352 4666 generic.go:334] "Generic (PLEG): container finished" podID="9dfa4f8e-85f8-4192-87e9-4101e0469040" containerID="52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de" exitCode=0 Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.652426 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swgth" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.652426 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swgth" event={"ID":"9dfa4f8e-85f8-4192-87e9-4101e0469040","Type":"ContainerDied","Data":"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de"} Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.652498 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swgth" event={"ID":"9dfa4f8e-85f8-4192-87e9-4101e0469040","Type":"ContainerDied","Data":"0a43dee098d08b5bfc05166dcb3ea80a1178f2248800a4ec5bfd3e2dd0193f62"} Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.652525 4666 scope.go:117] "RemoveContainer" containerID="52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.676305 4666 scope.go:117] "RemoveContainer" containerID="52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de" Dec 03 12:41:20 crc kubenswrapper[4666]: E1203 12:41:20.676838 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de\": container with ID starting with 52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de not found: ID does not exist" containerID="52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.676912 4666 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de"} err="failed to get container status \"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de\": rpc error: code = NotFound desc = could not find container \"52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de\": container with ID starting with 52fb5510668c4b197bef4b8317c42381544f943dc4748cb039338efc068bb5de not found: ID does not exist" Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.688297 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:20 crc kubenswrapper[4666]: I1203 12:41:20.695993 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-swgth"] Dec 03 12:41:21 crc kubenswrapper[4666]: I1203 12:41:21.434207 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfa4f8e-85f8-4192-87e9-4101e0469040" path="/var/lib/kubelet/pods/9dfa4f8e-85f8-4192-87e9-4101e0469040/volumes" Dec 03 12:41:26 crc kubenswrapper[4666]: I1203 12:41:26.792830 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:26 crc kubenswrapper[4666]: I1203 12:41:26.793631 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:26 crc kubenswrapper[4666]: I1203 12:41:26.836552 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:27 crc kubenswrapper[4666]: I1203 12:41:27.337050 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xpxmn" Dec 03 12:41:27 crc kubenswrapper[4666]: I1203 12:41:27.752219 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-z7hzf" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.143216 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6"] Dec 03 12:41:32 crc kubenswrapper[4666]: E1203 12:41:32.144533 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfa4f8e-85f8-4192-87e9-4101e0469040" containerName="registry-server" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.144554 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfa4f8e-85f8-4192-87e9-4101e0469040" containerName="registry-server" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.144694 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfa4f8e-85f8-4192-87e9-4101e0469040" containerName="registry-server" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.145717 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.156456 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pksqr" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.163313 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6"] Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.216647 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.216822 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.217014 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsjb\" (UniqueName: \"kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.317781 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsjb\" (UniqueName: \"kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.317967 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.318027 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.318758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.319223 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.357431 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsjb\" (UniqueName: \"kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb\") pod \"febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.465842 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:32 crc kubenswrapper[4666]: I1203 12:41:32.755466 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6"] Dec 03 12:41:32 crc kubenswrapper[4666]: W1203 12:41:32.760905 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ad938cb_69f7_46d8_9831_21332a984dfc.slice/crio-1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171 WatchSource:0}: Error finding container 1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171: Status 404 returned error can't find the container with id 1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171 Dec 03 12:41:33 crc kubenswrapper[4666]: I1203 12:41:33.771382 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerID="ae5796fcafec76c17556867b8ca69fa2f1e8684fe5c20a2526db2cd9a09fcef5" exitCode=0 Dec 03 12:41:33 crc kubenswrapper[4666]: I1203 12:41:33.771437 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" event={"ID":"9ad938cb-69f7-46d8-9831-21332a984dfc","Type":"ContainerDied","Data":"ae5796fcafec76c17556867b8ca69fa2f1e8684fe5c20a2526db2cd9a09fcef5"} Dec 03 12:41:33 crc kubenswrapper[4666]: I1203 12:41:33.771476 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" event={"ID":"9ad938cb-69f7-46d8-9831-21332a984dfc","Type":"ContainerStarted","Data":"1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171"} Dec 03 12:41:34 crc kubenswrapper[4666]: I1203 12:41:34.787138 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerID="320ba7f710c2387a914b1629d92dd0554759b16031f14be86d36d664ab0c0f9b" exitCode=0 Dec 03 12:41:34 crc kubenswrapper[4666]: I1203 12:41:34.787251 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" event={"ID":"9ad938cb-69f7-46d8-9831-21332a984dfc","Type":"ContainerDied","Data":"320ba7f710c2387a914b1629d92dd0554759b16031f14be86d36d664ab0c0f9b"} Dec 03 12:41:35 crc kubenswrapper[4666]: I1203 12:41:35.797038 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerID="b7187b21378b89ceedf35abd5b652b6e68d95ad527b15c8883fbba2d347a90b5" exitCode=0 Dec 03 12:41:35 crc kubenswrapper[4666]: I1203 12:41:35.797142 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" event={"ID":"9ad938cb-69f7-46d8-9831-21332a984dfc","Type":"ContainerDied","Data":"b7187b21378b89ceedf35abd5b652b6e68d95ad527b15c8883fbba2d347a90b5"} Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.079961 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.200172 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util\") pod \"9ad938cb-69f7-46d8-9831-21332a984dfc\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.200232 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxsjb\" (UniqueName: \"kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb\") pod \"9ad938cb-69f7-46d8-9831-21332a984dfc\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.200258 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle\") pod \"9ad938cb-69f7-46d8-9831-21332a984dfc\" (UID: \"9ad938cb-69f7-46d8-9831-21332a984dfc\") " Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.201159 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle" (OuterVolumeSpecName: "bundle") pod "9ad938cb-69f7-46d8-9831-21332a984dfc" (UID: "9ad938cb-69f7-46d8-9831-21332a984dfc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.209055 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb" (OuterVolumeSpecName: "kube-api-access-xxsjb") pod "9ad938cb-69f7-46d8-9831-21332a984dfc" (UID: "9ad938cb-69f7-46d8-9831-21332a984dfc"). InnerVolumeSpecName "kube-api-access-xxsjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.220686 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util" (OuterVolumeSpecName: "util") pod "9ad938cb-69f7-46d8-9831-21332a984dfc" (UID: "9ad938cb-69f7-46d8-9831-21332a984dfc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.301702 4666 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-util\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.301739 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxsjb\" (UniqueName: \"kubernetes.io/projected/9ad938cb-69f7-46d8-9831-21332a984dfc-kube-api-access-xxsjb\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.301757 4666 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ad938cb-69f7-46d8-9831-21332a984dfc-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.813978 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" event={"ID":"9ad938cb-69f7-46d8-9831-21332a984dfc","Type":"ContainerDied","Data":"1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171"} Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.814075 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1664faeebd18b467d5781fdbbea049fd7d39073fb2922ec656ec109ece62d171" Dec 03 12:41:37 crc kubenswrapper[4666]: I1203 12:41:37.814028 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6" Dec 03 12:41:39 crc kubenswrapper[4666]: I1203 12:41:39.866386 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:41:39 crc kubenswrapper[4666]: I1203 12:41:39.866719 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:41:39 crc kubenswrapper[4666]: I1203 12:41:39.866758 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:41:39 crc kubenswrapper[4666]: I1203 12:41:39.867378 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:41:39 crc kubenswrapper[4666]: I1203 12:41:39.867436 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" gracePeriod=600 Dec 03 12:41:39 crc kubenswrapper[4666]: E1203 12:41:39.999696 4666 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:41:40 crc kubenswrapper[4666]: I1203 12:41:40.838575 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" exitCode=0 Dec 03 12:41:40 crc kubenswrapper[4666]: I1203 12:41:40.838645 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"} Dec 03 12:41:40 crc kubenswrapper[4666]: I1203 12:41:40.838697 4666 scope.go:117] "RemoveContainer" containerID="ce96a0ab731e8d61bc9ca2f38ac40f0b4915f4493598279cd146755b95731fdb" Dec 03 12:41:40 crc kubenswrapper[4666]: I1203 12:41:40.839424 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:41:40 crc kubenswrapper[4666]: E1203 12:41:40.839848 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.101024 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns"] Dec 03 12:41:44 crc kubenswrapper[4666]: E1203 12:41:44.101788 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="pull" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.101804 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="pull" Dec 03 12:41:44 crc kubenswrapper[4666]: E1203 12:41:44.101827 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="extract" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.101835 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="extract" Dec 03 12:41:44 crc kubenswrapper[4666]: E1203 12:41:44.101851 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="util" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.101858 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="util" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.101987 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad938cb-69f7-46d8-9831-21332a984dfc" containerName="extract" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.102523 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.104867 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-h9dpt" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.125185 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns"] Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.296222 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfv9\" (UniqueName: \"kubernetes.io/projected/7aa8983e-49b3-4356-aae3-5388d37ae886-kube-api-access-rrfv9\") pod \"openstack-operator-controller-operator-7f9cd9598-chsns\" (UID: \"7aa8983e-49b3-4356-aae3-5388d37ae886\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.398674 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfv9\" (UniqueName: \"kubernetes.io/projected/7aa8983e-49b3-4356-aae3-5388d37ae886-kube-api-access-rrfv9\") pod \"openstack-operator-controller-operator-7f9cd9598-chsns\" (UID: \"7aa8983e-49b3-4356-aae3-5388d37ae886\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.420275 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfv9\" (UniqueName: \"kubernetes.io/projected/7aa8983e-49b3-4356-aae3-5388d37ae886-kube-api-access-rrfv9\") pod \"openstack-operator-controller-operator-7f9cd9598-chsns\" (UID: \"7aa8983e-49b3-4356-aae3-5388d37ae886\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:44 crc kubenswrapper[4666]: I1203 12:41:44.719166 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:45 crc kubenswrapper[4666]: I1203 12:41:45.041676 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns"] Dec 03 12:41:45 crc kubenswrapper[4666]: W1203 12:41:45.050571 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa8983e_49b3_4356_aae3_5388d37ae886.slice/crio-f14248348efe4bc009d92e327e1fbc2cfa4a19a09c1ce8f26a7eb1b8c0fff15b WatchSource:0}: Error finding container f14248348efe4bc009d92e327e1fbc2cfa4a19a09c1ce8f26a7eb1b8c0fff15b: Status 404 returned error can't find the container with id f14248348efe4bc009d92e327e1fbc2cfa4a19a09c1ce8f26a7eb1b8c0fff15b Dec 03 12:41:45 crc kubenswrapper[4666]: I1203 12:41:45.908367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" event={"ID":"7aa8983e-49b3-4356-aae3-5388d37ae886","Type":"ContainerStarted","Data":"f14248348efe4bc009d92e327e1fbc2cfa4a19a09c1ce8f26a7eb1b8c0fff15b"} Dec 03 12:41:49 crc kubenswrapper[4666]: I1203 12:41:49.939692 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" event={"ID":"7aa8983e-49b3-4356-aae3-5388d37ae886","Type":"ContainerStarted","Data":"50c8eb6d229de9214e2e59c47ff95a6fbbd668447458ce2165cbaca6164dbde7"} Dec 03 12:41:49 crc kubenswrapper[4666]: I1203 12:41:49.940586 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:41:49 crc kubenswrapper[4666]: I1203 12:41:49.986123 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" podStartSLOduration=2.1042213849999998 podStartE2EDuration="5.986061983s" podCreationTimestamp="2025-12-03 12:41:44 +0000 UTC" firstStartedPulling="2025-12-03 12:41:45.052833554 +0000 UTC m=+1693.897794615" lastFinishedPulling="2025-12-03 12:41:48.934674152 +0000 UTC m=+1697.779635213" observedRunningTime="2025-12-03 12:41:49.979169707 +0000 UTC m=+1698.824130768" watchObservedRunningTime="2025-12-03 12:41:49.986061983 +0000 UTC m=+1698.831023034" Dec 03 12:41:53 crc kubenswrapper[4666]: I1203 12:41:53.423756 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:41:53 crc kubenswrapper[4666]: E1203 12:41:53.425449 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:41:54 crc kubenswrapper[4666]: I1203 12:41:54.724518 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7f9cd9598-chsns" Dec 03 12:42:06 crc kubenswrapper[4666]: I1203 12:42:06.423486 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:42:06 crc kubenswrapper[4666]: E1203 
12:42:06.424606 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.813927 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.815817 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.818437 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n62qj" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.820600 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.822001 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.827689 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.829424 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-scfq8" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.838352 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.874220 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.875361 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.881283 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.881808 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pqs57" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.882535 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.889453 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dtgcg" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.910592 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gkv\" (UniqueName: \"kubernetes.io/projected/75171f12-3098-437a-a941-31312676f362-kube-api-access-s2gkv\") pod \"cinder-operator-controller-manager-859b6ccc6-c4plc\" (UID: \"75171f12-3098-437a-a941-31312676f362\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.910643 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jxr\" (UniqueName: \"kubernetes.io/projected/72fef244-af95-4c84-889b-04317e2f85e4-kube-api-access-27jxr\") pod \"barbican-operator-controller-manager-7d9dfd778-rp8qs\" (UID: \"72fef244-af95-4c84-889b-04317e2f85e4\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.912325 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.916486 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr"] Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.917686 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.921934 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-899k2" Dec 03 12:42:13 crc kubenswrapper[4666]: I1203 12:42:13.934816 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.017280 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.018152 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gkv\" (UniqueName: \"kubernetes.io/projected/75171f12-3098-437a-a941-31312676f362-kube-api-access-s2gkv\") pod \"cinder-operator-controller-manager-859b6ccc6-c4plc\" (UID: \"75171f12-3098-437a-a941-31312676f362\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.018204 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrj9h\" (UniqueName: \"kubernetes.io/projected/330ae135-611a-4ae6-ba73-fcb6a911c299-kube-api-access-lrj9h\") pod \"heat-operator-controller-manager-5f64f6f8bb-9cbjr\" (UID: \"330ae135-611a-4ae6-ba73-fcb6a911c299\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.018245 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-27jxr\" (UniqueName: \"kubernetes.io/projected/72fef244-af95-4c84-889b-04317e2f85e4-kube-api-access-27jxr\") pod \"barbican-operator-controller-manager-7d9dfd778-rp8qs\" (UID: \"72fef244-af95-4c84-889b-04317e2f85e4\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.018294 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg6mv\" (UniqueName: \"kubernetes.io/projected/21ae197d-ae5d-4129-b1db-114a42dc5eb8-kube-api-access-dg6mv\") pod \"designate-operator-controller-manager-78b4bc895b-jzzkf\" (UID: \"21ae197d-ae5d-4129-b1db-114a42dc5eb8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.018322 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc854\" (UniqueName: \"kubernetes.io/projected/6b5f798a-8be3-4c12-948b-4b9ff35d14ba-kube-api-access-lc854\") pod \"glance-operator-controller-manager-77987cd8cd-j6dh9\" (UID: \"6b5f798a-8be3-4c12-948b-4b9ff35d14ba\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.028866 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.030139 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.048164 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.061656 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m5j57" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.084144 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.085734 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.092823 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.093352 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gkv\" (UniqueName: \"kubernetes.io/projected/75171f12-3098-437a-a941-31312676f362-kube-api-access-s2gkv\") pod \"cinder-operator-controller-manager-859b6ccc6-c4plc\" (UID: \"75171f12-3098-437a-a941-31312676f362\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.102351 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.103761 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.110627 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.111792 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dccxq" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.112241 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-2w7bc" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.120290 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrj9h\" (UniqueName: \"kubernetes.io/projected/330ae135-611a-4ae6-ba73-fcb6a911c299-kube-api-access-lrj9h\") pod \"heat-operator-controller-manager-5f64f6f8bb-9cbjr\" (UID: \"330ae135-611a-4ae6-ba73-fcb6a911c299\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.120372 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg6mv\" (UniqueName: \"kubernetes.io/projected/21ae197d-ae5d-4129-b1db-114a42dc5eb8-kube-api-access-dg6mv\") pod \"designate-operator-controller-manager-78b4bc895b-jzzkf\" (UID: \"21ae197d-ae5d-4129-b1db-114a42dc5eb8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.120399 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc854\" (UniqueName: \"kubernetes.io/projected/6b5f798a-8be3-4c12-948b-4b9ff35d14ba-kube-api-access-lc854\") pod \"glance-operator-controller-manager-77987cd8cd-j6dh9\" (UID: \"6b5f798a-8be3-4c12-948b-4b9ff35d14ba\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.120456 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9pr\" (UniqueName: \"kubernetes.io/projected/1d9a62f9-0c20-4033-84d4-ade04922d04a-kube-api-access-lf9pr\") pod \"horizon-operator-controller-manager-68c6d99b8f-dn9wg\" (UID: \"1d9a62f9-0c20-4033-84d4-ade04922d04a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.120731 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jxr\" (UniqueName: \"kubernetes.io/projected/72fef244-af95-4c84-889b-04317e2f85e4-kube-api-access-27jxr\") pod \"barbican-operator-controller-manager-7d9dfd778-rp8qs\" (UID: \"72fef244-af95-4c84-889b-04317e2f85e4\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.129844 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.130862 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.137564 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.144410 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.152651 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kjskj" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.153169 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.153808 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.155038 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.160674 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-m2dp2" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.160847 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrj9h\" (UniqueName: \"kubernetes.io/projected/330ae135-611a-4ae6-ba73-fcb6a911c299-kube-api-access-lrj9h\") pod \"heat-operator-controller-manager-5f64f6f8bb-9cbjr\" (UID: \"330ae135-611a-4ae6-ba73-fcb6a911c299\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.161950 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.171544 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.171600 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.172954 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.183911 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg6mv\" (UniqueName: \"kubernetes.io/projected/21ae197d-ae5d-4129-b1db-114a42dc5eb8-kube-api-access-dg6mv\") pod \"designate-operator-controller-manager-78b4bc895b-jzzkf\" (UID: \"21ae197d-ae5d-4129-b1db-114a42dc5eb8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.191925 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc854\" (UniqueName: \"kubernetes.io/projected/6b5f798a-8be3-4c12-948b-4b9ff35d14ba-kube-api-access-lc854\") pod \"glance-operator-controller-manager-77987cd8cd-j6dh9\" (UID: \"6b5f798a-8be3-4c12-948b-4b9ff35d14ba\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.194796 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.204390 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2n5zw" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.213634 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.216450 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.220210 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.223986 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdscx\" (UniqueName: \"kubernetes.io/projected/dce8c65f-3951-4e68-a044-c4c59638fd05-kube-api-access-mdscx\") pod \"ironic-operator-controller-manager-6c548fd776-97svz\" (UID: \"dce8c65f-3951-4e68-a044-c4c59638fd05\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.224048 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqfbs\" (UniqueName: \"kubernetes.io/projected/e4686a7d-808f-47e8-b5cd-ec3af299a7f2-kube-api-access-qqfbs\") pod \"manila-operator-controller-manager-5797d476c-ntgb9\" (UID: \"e4686a7d-808f-47e8-b5cd-ec3af299a7f2\") " pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.224125 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqn7b\" (UniqueName: \"kubernetes.io/projected/b3582f8c-2777-4291-bc6a-42953fd2d928-kube-api-access-tqn7b\") pod \"keystone-operator-controller-manager-7765d96ddf-n287g\" (UID: \"b3582f8c-2777-4291-bc6a-42953fd2d928\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.224173 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9pr\" (UniqueName: \"kubernetes.io/projected/1d9a62f9-0c20-4033-84d4-ade04922d04a-kube-api-access-lf9pr\") pod \"horizon-operator-controller-manager-68c6d99b8f-dn9wg\" (UID: \"1d9a62f9-0c20-4033-84d4-ade04922d04a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.224208 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.224242 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wsr\" (UniqueName: \"kubernetes.io/projected/e9197948-361b-43e7-8cc6-db509c80c7b1-kube-api-access-67wsr\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.235336 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.240550 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fqqxs" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.256133 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.257220 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.270139 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.278761 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-725qn" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.299485 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9pr\" (UniqueName: \"kubernetes.io/projected/1d9a62f9-0c20-4033-84d4-ade04922d04a-kube-api-access-lf9pr\") pod \"horizon-operator-controller-manager-68c6d99b8f-dn9wg\" (UID: \"1d9a62f9-0c20-4033-84d4-ade04922d04a\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332707 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbd5\" (UniqueName: \"kubernetes.io/projected/78813232-79b6-4483-86cb-069995914531-kube-api-access-zgbd5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7lkjq\" (UID: \"78813232-79b6-4483-86cb-069995914531\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332765 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqn7b\" (UniqueName: \"kubernetes.io/projected/b3582f8c-2777-4291-bc6a-42953fd2d928-kube-api-access-tqn7b\") pod \"keystone-operator-controller-manager-7765d96ddf-n287g\" (UID: \"b3582f8c-2777-4291-bc6a-42953fd2d928\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332797 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6ht\" (UniqueName: \"kubernetes.io/projected/8048d3e0-a035-4a85-92ad-ca11dc24ccbe-kube-api-access-tk6ht\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-khplb\" (UID: \"8048d3e0-a035-4a85-92ad-ca11dc24ccbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332837 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332871 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wsr\" (UniqueName: \"kubernetes.io/projected/e9197948-361b-43e7-8cc6-db509c80c7b1-kube-api-access-67wsr\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332891 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55w6r\" (UniqueName: \"kubernetes.io/projected/d90e913f-9878-4644-b0f7-d0e313b8f897-kube-api-access-55w6r\") pod \"nova-operator-controller-manager-697bc559fc-ckxvk\" (UID: \"d90e913f-9878-4644-b0f7-d0e313b8f897\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:42:14 crc 
kubenswrapper[4666]: I1203 12:42:14.332947 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdscx\" (UniqueName: \"kubernetes.io/projected/dce8c65f-3951-4e68-a044-c4c59638fd05-kube-api-access-mdscx\") pod \"ironic-operator-controller-manager-6c548fd776-97svz\" (UID: \"dce8c65f-3951-4e68-a044-c4c59638fd05\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.332972 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqfbs\" (UniqueName: \"kubernetes.io/projected/e4686a7d-808f-47e8-b5cd-ec3af299a7f2-kube-api-access-qqfbs\") pod \"manila-operator-controller-manager-5797d476c-ntgb9\" (UID: \"e4686a7d-808f-47e8-b5cd-ec3af299a7f2\") " pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.333865 4666 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.333932 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert podName:e9197948-361b-43e7-8cc6-db509c80c7b1 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:14.833907402 +0000 UTC m=+1723.678868453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert") pod "infra-operator-controller-manager-57548d458d-vxrg7" (UID: "e9197948-361b-43e7-8cc6-db509c80c7b1") : secret "infra-operator-webhook-server-cert" not found Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.365565 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.368765 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.379393 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-gfvjk" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.384636 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqfbs\" (UniqueName: \"kubernetes.io/projected/e4686a7d-808f-47e8-b5cd-ec3af299a7f2-kube-api-access-qqfbs\") pod \"manila-operator-controller-manager-5797d476c-ntgb9\" (UID: \"e4686a7d-808f-47e8-b5cd-ec3af299a7f2\") " pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.394379 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.394379 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.414154 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdscx\" (UniqueName: \"kubernetes.io/projected/dce8c65f-3951-4e68-a044-c4c59638fd05-kube-api-access-mdscx\") pod \"ironic-operator-controller-manager-6c548fd776-97svz\" (UID: \"dce8c65f-3951-4e68-a044-c4c59638fd05\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.416197 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqn7b\" (UniqueName: \"kubernetes.io/projected/b3582f8c-2777-4291-bc6a-42953fd2d928-kube-api-access-tqn7b\") pod \"keystone-operator-controller-manager-7765d96ddf-n287g\" (UID: \"b3582f8c-2777-4291-bc6a-42953fd2d928\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.434858 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbd5\" (UniqueName: \"kubernetes.io/projected/78813232-79b6-4483-86cb-069995914531-kube-api-access-zgbd5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7lkjq\" (UID: \"78813232-79b6-4483-86cb-069995914531\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.434913 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6ht\" (UniqueName: \"kubernetes.io/projected/8048d3e0-a035-4a85-92ad-ca11dc24ccbe-kube-api-access-tk6ht\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-khplb\" (UID: \"8048d3e0-a035-4a85-92ad-ca11dc24ccbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.434969 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55w6r\" (UniqueName: \"kubernetes.io/projected/d90e913f-9878-4644-b0f7-d0e313b8f897-kube-api-access-55w6r\") pod \"nova-operator-controller-manager-697bc559fc-ckxvk\" (UID: \"d90e913f-9878-4644-b0f7-d0e313b8f897\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.435008 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnzp\" (UniqueName: \"kubernetes.io/projected/5a914e37-4302-4c77-8d4b-6c509dfbfc4e-kube-api-access-gsnzp\") pod \"octavia-operator-controller-manager-998648c74-q5d7b\" (UID: \"5a914e37-4302-4c77-8d4b-6c509dfbfc4e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.446699 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wsr\" (UniqueName: \"kubernetes.io/projected/e9197948-361b-43e7-8cc6-db509c80c7b1-kube-api-access-67wsr\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.477773 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb"]
Dec 03 12:42:14 crc kubenswrapper[4666]:
I1203 12:42:14.479841 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55w6r\" (UniqueName: \"kubernetes.io/projected/d90e913f-9878-4644-b0f7-d0e313b8f897-kube-api-access-55w6r\") pod \"nova-operator-controller-manager-697bc559fc-ckxvk\" (UID: \"d90e913f-9878-4644-b0f7-d0e313b8f897\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.482071 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbd5\" (UniqueName: \"kubernetes.io/projected/78813232-79b6-4483-86cb-069995914531-kube-api-access-zgbd5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-7lkjq\" (UID: \"78813232-79b6-4483-86cb-069995914531\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.502569 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6ht\" (UniqueName: \"kubernetes.io/projected/8048d3e0-a035-4a85-92ad-ca11dc24ccbe-kube-api-access-tk6ht\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-khplb\" (UID: \"8048d3e0-a035-4a85-92ad-ca11dc24ccbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.503740 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.531748 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.538757 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnzp\" (UniqueName: \"kubernetes.io/projected/5a914e37-4302-4c77-8d4b-6c509dfbfc4e-kube-api-access-gsnzp\") pod \"octavia-operator-controller-manager-998648c74-q5d7b\" (UID: \"5a914e37-4302-4c77-8d4b-6c509dfbfc4e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.572386 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.595238 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.602996 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnzp\" (UniqueName: \"kubernetes.io/projected/5a914e37-4302-4c77-8d4b-6c509dfbfc4e-kube-api-access-gsnzp\") pod \"octavia-operator-controller-manager-998648c74-q5d7b\" (UID: \"5a914e37-4302-4c77-8d4b-6c509dfbfc4e\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.611625 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.613182 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.616919 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5vl9v" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.621038 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.622471 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.622670 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.626988 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.627287 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-w5xdz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.679625 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.728813 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.730987 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.753229 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.755324 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.760058 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tstx9" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.769072 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.771619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrjc\" (UniqueName: \"kubernetes.io/projected/865a9d83-50b6-49fb-87f8-c46fa1453ed0-kube-api-access-vkrjc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.771943 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.772072 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69z9\" (UniqueName: \"kubernetes.io/projected/1d364f72-b379-4591-b3f4-17997cbcba6e-kube-api-access-q69z9\") pod \"ovn-operator-controller-manager-b6456fdb6-rt2zn\" (UID: \"1d364f72-b379-4591-b3f4-17997cbcba6e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.794282 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.804846 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.818390 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.819996 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.822484 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8nt5w" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.825228 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.846946 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xwp75"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.853003 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.858545 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-j4m5s" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.858799 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.860611 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.862550 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.867035 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-swr5c" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.867272 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.868769 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.870960 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-949rv" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.875299 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xwp75"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.876258 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.876567 4666 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.876671 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert podName:e9197948-361b-43e7-8cc6-db509c80c7b1 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:15.876639373 +0000 UTC m=+1724.721600424 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert") pod "infra-operator-controller-manager-57548d458d-vxrg7" (UID: "e9197948-361b-43e7-8cc6-db509c80c7b1") : secret "infra-operator-webhook-server-cert" not found
Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.877257 4666 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:14 crc kubenswrapper[4666]: E1203 12:42:14.877297 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert podName:865a9d83-50b6-49fb-87f8-c46fa1453ed0 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:15.377285981 +0000 UTC m=+1724.222247022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" (UID: "865a9d83-50b6-49fb-87f8-c46fa1453ed0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.876342 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.880930 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69z9\" (UniqueName: \"kubernetes.io/projected/1d364f72-b379-4591-b3f4-17997cbcba6e-kube-api-access-q69z9\") pod \"ovn-operator-controller-manager-b6456fdb6-rt2zn\" (UID: \"1d364f72-b379-4591-b3f4-17997cbcba6e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.881010 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt"]
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.881063 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqncb\" (UniqueName: \"kubernetes.io/projected/deefb3d8-d96a-4e86-839d-8d8a561f4645-kube-api-access-cqncb\") pod \"placement-operator-controller-manager-78f8948974-bwxhd\" (UID: \"deefb3d8-d96a-4e86-839d-8d8a561f4645\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.881216 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrjc\" (UniqueName: \"kubernetes.io/projected/865a9d83-50b6-49fb-87f8-c46fa1453ed0-kube-api-access-vkrjc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.891043 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr"]
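Note: at this point two cert Secrets are missing in openstack-operators, infra-operator-webhook-server-cert and openstack-baremetal-operator-webhook-server-cert, presumably published later by whatever manages the webhook certificates; the kubelet just keeps retrying until they appear. A hedged client-go sketch to check whether they exist yet; the namespace and secret names come from the log, while reading the kubeconfig from the default home location is an assumption:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a usable kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Secret names and namespace taken verbatim from the log entries above.
	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
	} {
		_, err := client.CoreV1().Secrets("openstack-operators").Get(context.Background(), name, metav1.GetOptions{})
		fmt.Printf("%s: present=%v (err=%v)\n", name, err == nil, err)
	}
}

Once both Gets stop returning NotFound, the pending "cert" mounts above should succeed on the kubelet's next retry.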
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.933503 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"]
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.936649 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.945402 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.945417 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.950628 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pfhsn"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.958747 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"]
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.962368 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrjc\" (UniqueName: \"kubernetes.io/projected/865a9d83-50b6-49fb-87f8-c46fa1453ed0-kube-api-access-vkrjc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.978394 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn"]
Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.980079 4666 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.981456 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn"] Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982190 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982227 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgzf\" (UniqueName: \"kubernetes.io/projected/39681ef6-2d50-4509-a81e-d6cd102695cd-kube-api-access-9cgzf\") pod \"test-operator-controller-manager-5854674fcc-xwp75\" (UID: \"39681ef6-2d50-4509-a81e-d6cd102695cd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982275 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqncb\" (UniqueName: \"kubernetes.io/projected/deefb3d8-d96a-4e86-839d-8d8a561f4645-kube-api-access-cqncb\") pod \"placement-operator-controller-manager-78f8948974-bwxhd\" (UID: \"deefb3d8-d96a-4e86-839d-8d8a561f4645\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982367 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68bl8\" (UniqueName: \"kubernetes.io/projected/20594b02-a42f-4747-abfc-cbee34847d81-kube-api-access-68bl8\") pod \"swift-operator-controller-manager-5f8c65bbfc-x2zqw\" (UID: \"20594b02-a42f-4747-abfc-cbee34847d81\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982401 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqvw\" (UniqueName: \"kubernetes.io/projected/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-kube-api-access-wrqvw\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982427 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzb9\" (UniqueName: \"kubernetes.io/projected/494c67d4-f61e-468c-a8d8-21a877c690e8-kube-api-access-qtzb9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-bcvbt\" (UID: \"494c67d4-f61e-468c-a8d8-21a877c690e8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982483 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " 
pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982569 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn479\" (UniqueName: \"kubernetes.io/projected/f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f-kube-api-access-dn479\") pod \"watcher-operator-controller-manager-769dc69bc-xmmjr\" (UID: \"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.982791 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g5zsz" Dec 03 12:42:14 crc kubenswrapper[4666]: I1203 12:42:14.985351 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69z9\" (UniqueName: \"kubernetes.io/projected/1d364f72-b379-4591-b3f4-17997cbcba6e-kube-api-access-q69z9\") pod \"ovn-operator-controller-manager-b6456fdb6-rt2zn\" (UID: \"1d364f72-b379-4591-b3f4-17997cbcba6e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.005870 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqncb\" (UniqueName: \"kubernetes.io/projected/deefb3d8-d96a-4e86-839d-8d8a561f4645-kube-api-access-cqncb\") pod \"placement-operator-controller-manager-78f8948974-bwxhd\" (UID: \"deefb3d8-d96a-4e86-839d-8d8a561f4645\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083646 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn479\" (UniqueName: \"kubernetes.io/projected/f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f-kube-api-access-dn479\") pod \"watcher-operator-controller-manager-769dc69bc-xmmjr\" (UID: \"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083704 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083721 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgzf\" (UniqueName: \"kubernetes.io/projected/39681ef6-2d50-4509-a81e-d6cd102695cd-kube-api-access-9cgzf\") pod \"test-operator-controller-manager-5854674fcc-xwp75\" (UID: \"39681ef6-2d50-4509-a81e-d6cd102695cd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083751 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwpvm\" (UniqueName: \"kubernetes.io/projected/e0637cb9-5703-4e26-b526-592b818a5304-kube-api-access-lwpvm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d49kn\" (UID: \"e0637cb9-5703-4e26-b526-592b818a5304\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" Dec 03 12:42:15 crc kubenswrapper[4666]: 
I1203 12:42:15.083820 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68bl8\" (UniqueName: \"kubernetes.io/projected/20594b02-a42f-4747-abfc-cbee34847d81-kube-api-access-68bl8\") pod \"swift-operator-controller-manager-5f8c65bbfc-x2zqw\" (UID: \"20594b02-a42f-4747-abfc-cbee34847d81\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083842 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqvw\" (UniqueName: \"kubernetes.io/projected/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-kube-api-access-wrqvw\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083860 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzb9\" (UniqueName: \"kubernetes.io/projected/494c67d4-f61e-468c-a8d8-21a877c690e8-kube-api-access-qtzb9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-bcvbt\" (UID: \"494c67d4-f61e-468c-a8d8-21a877c690e8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.083884 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.084051 4666 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.084044 4666 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.084120 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:15.584103037 +0000 UTC m=+1724.429064088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.086782 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:15.58531506 +0000 UTC m=+1724.430276171 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "metrics-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.110616 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqvw\" (UniqueName: \"kubernetes.io/projected/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-kube-api-access-wrqvw\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.113245 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgzf\" (UniqueName: \"kubernetes.io/projected/39681ef6-2d50-4509-a81e-d6cd102695cd-kube-api-access-9cgzf\") pod \"test-operator-controller-manager-5854674fcc-xwp75\" (UID: \"39681ef6-2d50-4509-a81e-d6cd102695cd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.113418 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn479\" (UniqueName: \"kubernetes.io/projected/f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f-kube-api-access-dn479\") pod \"watcher-operator-controller-manager-769dc69bc-xmmjr\" (UID: \"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.114639 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzb9\" (UniqueName: \"kubernetes.io/projected/494c67d4-f61e-468c-a8d8-21a877c690e8-kube-api-access-qtzb9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-bcvbt\" (UID: \"494c67d4-f61e-468c-a8d8-21a877c690e8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.122184 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68bl8\" (UniqueName: \"kubernetes.io/projected/20594b02-a42f-4747-abfc-cbee34847d81-kube-api-access-68bl8\") pod \"swift-operator-controller-manager-5f8c65bbfc-x2zqw\" (UID: \"20594b02-a42f-4747-abfc-cbee34847d81\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.139642 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.185069 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwpvm\" (UniqueName: \"kubernetes.io/projected/e0637cb9-5703-4e26-b526-592b818a5304-kube-api-access-lwpvm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d49kn\" (UID: \"e0637cb9-5703-4e26-b526-592b818a5304\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.195330 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.236443 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwpvm\" (UniqueName: \"kubernetes.io/projected/e0637cb9-5703-4e26-b526-592b818a5304-kube-api-access-lwpvm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d49kn\" (UID: \"e0637cb9-5703-4e26-b526-592b818a5304\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.304934 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.311609 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.332686 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.349464 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.376488 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.389564 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.390893 4666 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.390971 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert podName:865a9d83-50b6-49fb-87f8-c46fa1453ed0 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:16.390952906 +0000 UTC m=+1725.235913957 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" (UID: "865a9d83-50b6-49fb-87f8-c46fa1453ed0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.442706 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.463567 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.592307 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.592797 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.592698 4666 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.593054 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:16.593035575 +0000 UTC m=+1725.437996626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.592990 4666 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.593412 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:16.593402935 +0000 UTC m=+1725.438363976 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "metrics-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.728458 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.737206 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs"] Dec 03 12:42:15 crc kubenswrapper[4666]: W1203 12:42:15.742950 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d9a62f9_0c20_4033_84d4_ade04922d04a.slice/crio-43b22fb91974249fcadf9bef27dc356854df3413cfc9f5a23a869beeaa81bde9 WatchSource:0}: Error finding container 43b22fb91974249fcadf9bef27dc356854df3413cfc9f5a23a869beeaa81bde9: Status 404 returned error can't find the container with id 43b22fb91974249fcadf9bef27dc356854df3413cfc9f5a23a869beeaa81bde9 Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.749680 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.769995 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk"] Dec 03 12:42:15 crc kubenswrapper[4666]: W1203 12:42:15.775360 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90e913f_9878_4644_b0f7_d0e313b8f897.slice/crio-9c47579d6a70d78ea8a4459e622fb573f0753c990b2866afdc6d6072ca9ff0ab WatchSource:0}: Error finding container 9c47579d6a70d78ea8a4459e622fb573f0753c990b2866afdc6d6072ca9ff0ab: Status 404 returned error can't find the container with id 9c47579d6a70d78ea8a4459e622fb573f0753c990b2866afdc6d6072ca9ff0ab Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.780830 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.896705 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.896936 4666 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: E1203 12:42:15.897047 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert podName:e9197948-361b-43e7-8cc6-db509c80c7b1 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:17.897024367 +0000 UTC m=+1726.741985418 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert") pod "infra-operator-controller-manager-57548d458d-vxrg7" (UID: "e9197948-361b-43e7-8cc6-db509c80c7b1") : secret "infra-operator-webhook-server-cert" not found Dec 03 12:42:15 crc kubenswrapper[4666]: W1203 12:42:15.965376 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod494c67d4_f61e_468c_a8d8_21a877c690e8.slice/crio-4e7522e8ec55859dc88dbc5e93e3c93c1906b709459e915cf8271220890f5863 WatchSource:0}: Error finding container 4e7522e8ec55859dc88dbc5e93e3c93c1906b709459e915cf8271220890f5863: Status 404 returned error can't find the container with id 4e7522e8ec55859dc88dbc5e93e3c93c1906b709459e915cf8271220890f5863 Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.965426 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd"] Dec 03 12:42:15 crc kubenswrapper[4666]: W1203 12:42:15.968914 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeefb3d8_d96a_4e86_839d_8d8a561f4645.slice/crio-cca6bee882ab668bba3e554435afcceef50c7e49f1a5eeb9b2d546741aeb418f WatchSource:0}: Error finding container cca6bee882ab668bba3e554435afcceef50c7e49f1a5eeb9b2d546741aeb418f: Status 404 returned error can't find the container with id cca6bee882ab668bba3e554435afcceef50c7e49f1a5eeb9b2d546741aeb418f Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.972015 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.986501 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.994155 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz"] Dec 03 12:42:15 crc kubenswrapper[4666]: I1203 12:42:15.998396 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq"] Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.021141 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gsnzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-q5d7b_openstack-operators(5a914e37-4302-4c77-8d4b-6c509dfbfc4e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.021354 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b"] Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.026415 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gsnzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-q5d7b_openstack-operators(5a914e37-4302-4c77-8d4b-6c509dfbfc4e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" 
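Note: the ErrImagePull failures here are not registry errors. With a dozen-plus operator pods starting at once, the kubelet's image-pull rate limiter rejects pulls beyond its budget and reports "pull QPS exceeded"; the pod workers then re-sync and retry, as the "Error syncing pod, skipping" entry below shows. A sketch of that token-bucket behavior, assuming the common kubelet defaults of --registry-qps=5 and --registry-burst=10 (this node's actual settings are not visible in the log):

package main

import (
	"fmt"

	"k8s.io/client-go/util/flowcontrol"
)

func main() {
	// Assumed defaults: 5 pulls/second sustained, burst of 10.
	limiter := flowcontrol.NewTokenBucketRateLimiter(5, 10)
	// Simulate many pods requesting image pulls in the same instant.
	for pull := 1; pull <= 15; pull++ {
		if limiter.TryAccept() {
			fmt.Printf("image pull %d: admitted\n", pull)
		} else {
			// The kubelet surfaces this case as ErrImagePull: pull QPS exceeded.
			fmt.Printf("image pull %d: rejected (pull QPS exceeded)\n", pull)
		}
	}
}

The rejected pulls are not terminal; once the bucket refills, the retried syncs pull the same images successfully.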
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.027568 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" podUID="5a914e37-4302-4c77-8d4b-6c509dfbfc4e"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.036518 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g"]
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.048786 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9"]
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.049903 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqn7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-n287g_openstack-operators(b3582f8c-2777-4291-bc6a-42953fd2d928): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: W1203 12:42:16.050796 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4686a7d_808f_47e8_b5cd_ec3af299a7f2.slice/crio-cb03f7d23c949683d209506c0bc446db5d3f84cb446909628e7f1d991428b30d WatchSource:0}: Error finding container cb03f7d23c949683d209506c0bc446db5d3f84cb446909628e7f1d991428b30d: Status 404 returned error can't find the container with id cb03f7d23c949683d209506c0bc446db5d3f84cb446909628e7f1d991428b30d
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.052622 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwpvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-d49kn_openstack-operators(e0637cb9-5703-4e26-b526-592b818a5304): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.053053 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqn7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-n287g_openstack-operators(b3582f8c-2777-4291-bc6a-42953fd2d928): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.055539 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q69z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-rt2zn_openstack-operators(1d364f72-b379-4591-b3f4-17997cbcba6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.055712 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn"]
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.055789 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" podUID="b3582f8c-2777-4291-bc6a-42953fd2d928"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.055836 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" podUID="e0637cb9-5703-4e26-b526-592b818a5304"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.059650 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q69z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-rt2zn_openstack-operators(1d364f72-b379-4591-b3f4-17997cbcba6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.060599 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn"]
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.060843 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" podUID="1d364f72-b379-4591-b3f4-17997cbcba6e"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.062608 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.203:5001/openstack-k8s-operators/manila-operator:67fa2d9ff285fc4fa8544a0e3c8f8bba90fab519,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqfbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5797d476c-ntgb9_openstack-operators(e4686a7d-808f-47e8-b5cd-ec3af299a7f2): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.062848 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9cgzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-xwp75_openstack-operators(39681ef6-2d50-4509-a81e-d6cd102695cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.066185 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xwp75"]
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.066946 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9cgzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-xwp75_openstack-operators(39681ef6-2d50-4509-a81e-d6cd102695cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.067435 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqfbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5797d476c-ntgb9_openstack-operators(e4686a7d-808f-47e8-b5cd-ec3af299a7f2): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.069011 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" podUID="e4686a7d-808f-47e8-b5cd-ec3af299a7f2"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.069078 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" podUID="39681ef6-2d50-4509-a81e-d6cd102695cd"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.093489 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw"]
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.101163 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr"]
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.116622 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dn479,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xmmjr_openstack-operators(f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.118658 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dn479,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xmmjr_openstack-operators(f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.126765 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" podUID="f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.138858 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" event={"ID":"494c67d4-f61e-468c-a8d8-21a877c690e8","Type":"ContainerStarted","Data":"4e7522e8ec55859dc88dbc5e93e3c93c1906b709459e915cf8271220890f5863"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.140780 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" event={"ID":"330ae135-611a-4ae6-ba73-fcb6a911c299","Type":"ContainerStarted","Data":"c6d179488743df931538d8fca72112aac500258d6c37c57fe5b65b82b7d24ba1"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.142383 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" event={"ID":"b3582f8c-2777-4291-bc6a-42953fd2d928","Type":"ContainerStarted","Data":"312b6f62be8c1ed92fc7a5e31412778d20412d3be3f604bc672b3e8a2110d16a"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.145368 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" podUID="b3582f8c-2777-4291-bc6a-42953fd2d928"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.146296 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" event={"ID":"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f","Type":"ContainerStarted","Data":"314567ceabac906d8021a2a0a408de4fb8ecab47c33ee13a13aec6b323ce415e"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.148775 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" podUID="f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.150863 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" event={"ID":"dce8c65f-3951-4e68-a044-c4c59638fd05","Type":"ContainerStarted","Data":"702b4625a6c749745cb40f938f873551e10e6222d1baa1fe3b18757775e9c17d"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.153872 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" event={"ID":"8048d3e0-a035-4a85-92ad-ca11dc24ccbe","Type":"ContainerStarted","Data":"d91965c24e3ec1c27844cc1e1aff3842c63d533e605f583b7d9e33d4532ccde4"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.155631 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" event={"ID":"5a914e37-4302-4c77-8d4b-6c509dfbfc4e","Type":"ContainerStarted","Data":"621f4b7a10d8e4c988e2eed6196911792d45085fb8df6cea15de69f2e89b639d"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.167786 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" podUID="5a914e37-4302-4c77-8d4b-6c509dfbfc4e"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.168626 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" event={"ID":"1d364f72-b379-4591-b3f4-17997cbcba6e","Type":"ContainerStarted","Data":"cb966e0f74ef2ca5b7cc5278526083f96a39a4fafce0733f60949173424189e4"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.172288 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" event={"ID":"6b5f798a-8be3-4c12-948b-4b9ff35d14ba","Type":"ContainerStarted","Data":"2fef6c29b4efe1dc3f27d6101eb20db00090e70d65dedaebda4ce476533eae82"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.173008 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" podUID="1d364f72-b379-4591-b3f4-17997cbcba6e"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.174216 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" event={"ID":"e0637cb9-5703-4e26-b526-592b818a5304","Type":"ContainerStarted","Data":"53685f47b456d00b104ff9c3699c5f5eeaa4a30b3271bd6b2daaf0dcadf76ab4"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.175722 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" podUID="e0637cb9-5703-4e26-b526-592b818a5304"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.176333 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" event={"ID":"21ae197d-ae5d-4129-b1db-114a42dc5eb8","Type":"ContainerStarted","Data":"78f6175b2887489cd75b915777805640512fdbfcd3aaeca73868e48c1288a051"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.176977 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" event={"ID":"72fef244-af95-4c84-889b-04317e2f85e4","Type":"ContainerStarted","Data":"121ed1593f5a32d1cfbb9d3f93e2a459f34747d317e4e9daffd59e6e1725e618"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.185762 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" event={"ID":"d90e913f-9878-4644-b0f7-d0e313b8f897","Type":"ContainerStarted","Data":"9c47579d6a70d78ea8a4459e622fb573f0753c990b2866afdc6d6072ca9ff0ab"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.186858 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" event={"ID":"deefb3d8-d96a-4e86-839d-8d8a561f4645","Type":"ContainerStarted","Data":"cca6bee882ab668bba3e554435afcceef50c7e49f1a5eeb9b2d546741aeb418f"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.188404 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" event={"ID":"e4686a7d-808f-47e8-b5cd-ec3af299a7f2","Type":"ContainerStarted","Data":"cb03f7d23c949683d209506c0bc446db5d3f84cb446909628e7f1d991428b30d"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.193450 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" event={"ID":"39681ef6-2d50-4509-a81e-d6cd102695cd","Type":"ContainerStarted","Data":"3d3648f89246597d0758c9104a3ac87feec145962990517bec5d4509df574ee2"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.193471 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/openstack-k8s-operators/manila-operator:67fa2d9ff285fc4fa8544a0e3c8f8bba90fab519\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" podUID="e4686a7d-808f-47e8-b5cd-ec3af299a7f2"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.195068 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" event={"ID":"20594b02-a42f-4747-abfc-cbee34847d81","Type":"ContainerStarted","Data":"4350b74c6476ebc77910ff7195d35a839f2b169ae4db43d42fd64a451f57481d"}
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.195202 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" podUID="39681ef6-2d50-4509-a81e-d6cd102695cd"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.195845 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" event={"ID":"1d9a62f9-0c20-4033-84d4-ade04922d04a","Type":"ContainerStarted","Data":"43b22fb91974249fcadf9bef27dc356854df3413cfc9f5a23a869beeaa81bde9"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.196703 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" event={"ID":"78813232-79b6-4483-86cb-069995914531","Type":"ContainerStarted","Data":"b0c4d3d40504b95f64227cbd3d2eb3f631f20d9d22ff3fd4be4fb55b3cac2b52"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.197344 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" event={"ID":"75171f12-3098-437a-a941-31312676f362","Type":"ContainerStarted","Data":"bb4f533ef8071ddb54b46dc5fa648c5f63d0c3b01aba12cef9771f89c56eca61"}
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.410316 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.410533 4666 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.410629 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert podName:865a9d83-50b6-49fb-87f8-c46fa1453ed0 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:18.4106053 +0000 UTC m=+1727.255566351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" (UID: "865a9d83-50b6-49fb-87f8-c46fa1453ed0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.614817 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:16 crc kubenswrapper[4666]: I1203 12:42:16.614944 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.615104 4666 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.615155 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:18.615140305 +0000 UTC m=+1727.460101356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "webhook-server-cert" not found
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.615500 4666 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 12:42:16 crc kubenswrapper[4666]: E1203 12:42:16.615536 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:18.615529046 +0000 UTC m=+1727.460490097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "metrics-server-cert" not found
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.229900 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" podUID="e0637cb9-5703-4e26-b526-592b818a5304"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.231482 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/openstack-k8s-operators/manila-operator:67fa2d9ff285fc4fa8544a0e3c8f8bba90fab519\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" podUID="e4686a7d-808f-47e8-b5cd-ec3af299a7f2"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.235379 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" podUID="5a914e37-4302-4c77-8d4b-6c509dfbfc4e"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.235583 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" podUID="b3582f8c-2777-4291-bc6a-42953fd2d928"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.236483 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" podUID="1d364f72-b379-4591-b3f4-17997cbcba6e"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.237518 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" podUID="f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.237883 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" podUID="39681ef6-2d50-4509-a81e-d6cd102695cd"
Dec 03 12:42:17 crc kubenswrapper[4666]: I1203 12:42:17.946374 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.946659 4666 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 03 12:42:17 crc kubenswrapper[4666]: E1203 12:42:17.946722 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert podName:e9197948-361b-43e7-8cc6-db509c80c7b1 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:21.946704484 +0000 UTC m=+1730.791665535 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert") pod "infra-operator-controller-manager-57548d458d-vxrg7" (UID: "e9197948-361b-43e7-8cc6-db509c80c7b1") : secret "infra-operator-webhook-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: I1203 12:42:18.455848 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.456147 4666 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.456291 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert podName:865a9d83-50b6-49fb-87f8-c46fa1453ed0 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:22.456238828 +0000 UTC m=+1731.301199879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" (UID: "865a9d83-50b6-49fb-87f8-c46fa1453ed0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: I1203 12:42:18.659228 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:18 crc kubenswrapper[4666]: I1203 12:42:18.659324 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.659488 4666 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.659544 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:22.659529289 +0000 UTC m=+1731.504490340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "metrics-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.659960 4666 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 12:42:18 crc kubenswrapper[4666]: E1203 12:42:18.659996 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:22.659988492 +0000 UTC m=+1731.504949543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "webhook-server-cert" not found
Dec 03 12:42:21 crc kubenswrapper[4666]: I1203 12:42:21.436714 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"
Dec 03 12:42:21 crc kubenswrapper[4666]: E1203 12:42:21.437696 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:42:22 crc kubenswrapper[4666]: I1203 12:42:22.010535 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.010927 4666 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.011053 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert podName:e9197948-361b-43e7-8cc6-db509c80c7b1 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:30.010995981 +0000 UTC m=+1738.855957052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert") pod "infra-operator-controller-manager-57548d458d-vxrg7" (UID: "e9197948-361b-43e7-8cc6-db509c80c7b1") : secret "infra-operator-webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: I1203 12:42:22.519034 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.519570 4666 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.519650 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert podName:865a9d83-50b6-49fb-87f8-c46fa1453ed0 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:30.519626881 +0000 UTC m=+1739.364587932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" (UID: "865a9d83-50b6-49fb-87f8-c46fa1453ed0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: I1203 12:42:22.721949 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.722178 4666 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.722602 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:30.722576803 +0000 UTC m=+1739.567537854 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "webhook-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: I1203 12:42:22.722520 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.722644 4666 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 12:42:22 crc kubenswrapper[4666]: E1203 12:42:22.722724 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs podName:ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45 nodeName:}" failed. No retries permitted until 2025-12-03 12:42:30.722701887 +0000 UTC m=+1739.567663028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs") pod "openstack-operator-controller-manager-54468f9998-5pr6c" (UID: "ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45") : secret "metrics-server-cert" not found
Dec 03 12:42:29 crc kubenswrapper[4666]: E1203 12:42:29.518651 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5"
Dec 03 12:42:29 crc kubenswrapper[4666]: E1203 12:42:29.519893 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lf9pr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-dn9wg_openstack-operators(1d9a62f9-0c20-4033-84d4-ade04922d04a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.051125 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.061209 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9197948-361b-43e7-8cc6-db509c80c7b1-cert\") pod \"infra-operator-controller-manager-57548d458d-vxrg7\" (UID: \"e9197948-361b-43e7-8cc6-db509c80c7b1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.159623 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.559947 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.575481 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/865a9d83-50b6-49fb-87f8-c46fa1453ed0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt\" (UID: \"865a9d83-50b6-49fb-87f8-c46fa1453ed0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.763444 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.763915 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.764020 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.770141 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-metrics-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.770882 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45-webhook-certs\") pod \"openstack-operator-controller-manager-54468f9998-5pr6c\" (UID: \"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45\") " pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:30 crc kubenswrapper[4666]: I1203 12:42:30.919381 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"
Dec 03 12:42:33 crc kubenswrapper[4666]: I1203 12:42:33.424054 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"
Dec 03 12:42:33 crc kubenswrapper[4666]: E1203 12:42:33.425361 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:42:38 crc kubenswrapper[4666]: E1203 12:42:38.363361 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670"
Dec 03 12:42:38 crc kubenswrapper[4666]: E1203 12:42:38.364075 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-55w6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-ckxvk_openstack-operators(d90e913f-9878-4644-b0f7-d0e313b8f897): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.031506 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c"]
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.127741 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7"]
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.138055 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt"]
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.470652 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" event={"ID":"75171f12-3098-437a-a941-31312676f362","Type":"ContainerStarted","Data":"f46d50b81fb1734b365a12e44535b7c2ab9b7edeca04d52b7769a916194be3b0"}
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.473460 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" event={"ID":"20594b02-a42f-4747-abfc-cbee34847d81","Type":"ContainerStarted","Data":"08896b0f5bd4aac54b190361cf522ba7e2a1e1b4767d947e4c2843ac78b6b687"}
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.478029 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" event={"ID":"6b5f798a-8be3-4c12-948b-4b9ff35d14ba","Type":"ContainerStarted","Data":"de8ba68269d2d56f1161d28cd489a814b9cb1bd719b5ad9faaa3046dc6eb8315"}
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.482455 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" event={"ID":"78813232-79b6-4483-86cb-069995914531","Type":"ContainerStarted","Data":"4d6fff2b15b405072e14a83aca98748cf36b61b7a162eb7bbb76b6f4d6a7d0b6"}
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.484456 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" event={"ID":"494c67d4-f61e-468c-a8d8-21a877c690e8","Type":"ContainerStarted","Data":"cc726454b8ff9bea4efce5cfeac5fb8a0d070d1a9472efb152908e5708bcf6a7"}
Dec 03 12:42:39 crc kubenswrapper[4666]: I1203 12:42:39.498320 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" event={"ID":"72fef244-af95-4c84-889b-04317e2f85e4","Type":"ContainerStarted","Data":"8ed864459e2841f056212211412f1decd6e26e445ba7b9d9caa42ad6dd8730f6"} Dec 03 12:42:39 crc kubenswrapper[4666]: W1203 12:42:39.552467 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9197948_361b_43e7_8cc6_db509c80c7b1.slice/crio-93449fd45420683f03e4319bc1bbbd1ddd3d32bf861538a4468e31b9990dfa2e WatchSource:0}: Error finding container 93449fd45420683f03e4319bc1bbbd1ddd3d32bf861538a4468e31b9990dfa2e: Status 404 returned error can't find the container with id 93449fd45420683f03e4319bc1bbbd1ddd3d32bf861538a4468e31b9990dfa2e Dec 03 12:42:39 crc kubenswrapper[4666]: W1203 12:42:39.556664 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865a9d83_50b6_49fb_87f8_c46fa1453ed0.slice/crio-a2dc9af19b8ac5a4619e5d5b139918b37fc5ec7e6c3e46ac0f4f51cc6dcc5e83 WatchSource:0}: Error finding container a2dc9af19b8ac5a4619e5d5b139918b37fc5ec7e6c3e46ac0f4f51cc6dcc5e83: Status 404 returned error can't find the container with id a2dc9af19b8ac5a4619e5d5b139918b37fc5ec7e6c3e46ac0f4f51cc6dcc5e83 Dec 03 12:42:40 crc kubenswrapper[4666]: I1203 12:42:40.521979 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" event={"ID":"865a9d83-50b6-49fb-87f8-c46fa1453ed0","Type":"ContainerStarted","Data":"a2dc9af19b8ac5a4619e5d5b139918b37fc5ec7e6c3e46ac0f4f51cc6dcc5e83"} Dec 03 12:42:40 crc kubenswrapper[4666]: I1203 12:42:40.529757 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" event={"ID":"dce8c65f-3951-4e68-a044-c4c59638fd05","Type":"ContainerStarted","Data":"8118a4cc10cb387db6d7553ae4b723e012197b684e806e95a48773a34a441d67"} Dec 03 12:42:40 crc kubenswrapper[4666]: I1203 12:42:40.550155 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" event={"ID":"e9197948-361b-43e7-8cc6-db509c80c7b1","Type":"ContainerStarted","Data":"93449fd45420683f03e4319bc1bbbd1ddd3d32bf861538a4468e31b9990dfa2e"} Dec 03 12:42:40 crc kubenswrapper[4666]: I1203 12:42:40.566050 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" event={"ID":"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45","Type":"ContainerStarted","Data":"630dff50f696fc9b7f7fa59f410773c32b428faaf35dc34b97119d0e0646ff31"} Dec 03 12:42:40 crc kubenswrapper[4666]: I1203 12:42:40.582254 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" event={"ID":"deefb3d8-d96a-4e86-839d-8d8a561f4645","Type":"ContainerStarted","Data":"a206def116ac3c4e39e156cad67a5acb186b2ca8c4a8e8422fcf6fb51c01ea3c"} Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.607790 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" event={"ID":"8048d3e0-a035-4a85-92ad-ca11dc24ccbe","Type":"ContainerStarted","Data":"0b1a099febfc776da7019fe270c33a26d4a6e4fc1c2bea2c5758b4037f014a44"} Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.632761 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" event={"ID":"330ae135-611a-4ae6-ba73-fcb6a911c299","Type":"ContainerStarted","Data":"b0749e84d4f818d4d045a2f159e139008b06793e431cba7842bd14c600dfd267"} Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.669021 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" event={"ID":"ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45","Type":"ContainerStarted","Data":"e990e7aa923bd4112d655b5aeb3546d902912d56107089c5e3ef50733b7bbadd"} Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.669130 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.673190 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" event={"ID":"21ae197d-ae5d-4129-b1db-114a42dc5eb8","Type":"ContainerStarted","Data":"c5fd7aa96bc820b75f3e5ef138918f2b5d35c5f2b39905f5d76ed25cdcd988eb"} Dec 03 12:42:41 crc kubenswrapper[4666]: I1203 12:42:41.717104 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" podStartSLOduration=27.717058843 podStartE2EDuration="27.717058843s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:42:41.715435739 +0000 UTC m=+1750.560396790" watchObservedRunningTime="2025-12-03 12:42:41.717058843 +0000 UTC m=+1750.562019894" Dec 03 12:42:48 crc kubenswrapper[4666]: I1203 12:42:48.425550 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:42:48 crc kubenswrapper[4666]: E1203 12:42:48.426564 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:42:50 crc kubenswrapper[4666]: E1203 12:42:50.155164 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" podUID="1d9a62f9-0c20-4033-84d4-ade04922d04a" Dec 03 12:42:50 crc kubenswrapper[4666]: E1203 12:42:50.364752 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" podUID="d90e913f-9878-4644-b0f7-d0e313b8f897" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.763838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" event={"ID":"e9197948-361b-43e7-8cc6-db509c80c7b1","Type":"ContainerStarted","Data":"38f4bb737a975e82d0d64a9c1c9e47c83532d5a8d3f2f08ba37de36c9ccc71a9"} Dec 03 12:42:50 
crc kubenswrapper[4666]: I1203 12:42:50.774697 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" event={"ID":"5a914e37-4302-4c77-8d4b-6c509dfbfc4e","Type":"ContainerStarted","Data":"d9e41e71ede6e41800987b716d2914dc1af87adc1da97c3c791504fde54ca6e1"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.778457 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" event={"ID":"1d9a62f9-0c20-4033-84d4-ade04922d04a","Type":"ContainerStarted","Data":"a1c8dd73d5eac449fa58a0ab90f608d4d2d6be3c630ec1474341e7f7975a812a"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.783029 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" event={"ID":"b3582f8c-2777-4291-bc6a-42953fd2d928","Type":"ContainerStarted","Data":"e3d0e28815c0760c8b663f07b69e9c9e9637eda35b863de1f19b94299d421672"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.787789 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" event={"ID":"1d364f72-b379-4591-b3f4-17997cbcba6e","Type":"ContainerStarted","Data":"6f68acd9bf77e6e60564bfc9426d65e2af8c6eaae282688b04dc56b9012314f9"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.813824 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" event={"ID":"865a9d83-50b6-49fb-87f8-c46fa1453ed0","Type":"ContainerStarted","Data":"563fea541792e87e8ff986275cc8b197aff6edf745017a20724c4d77413dd094"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.813897 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" event={"ID":"865a9d83-50b6-49fb-87f8-c46fa1453ed0","Type":"ContainerStarted","Data":"a8be3c55e2932a3afef5b668068d48b8e4e45ec4c1a13bef3173e2bdbb6ec5fc"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.814737 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.829752 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" event={"ID":"39681ef6-2d50-4509-a81e-d6cd102695cd","Type":"ContainerStarted","Data":"8727fb531b530e52df49a7ef92b8954f77bb5743faf0e5cce968ecf2d196e80b"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.845433 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" event={"ID":"494c67d4-f61e-468c-a8d8-21a877c690e8","Type":"ContainerStarted","Data":"182a36e2c9865fbc53950a80407e571a3dfe9a104a6cc3c6c765cc7fc7f07473"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.848665 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.854393 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.861116 4666 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" event={"ID":"e0637cb9-5703-4e26-b526-592b818a5304","Type":"ContainerStarted","Data":"d390abd101e65cbb539f13a36402d573277cc7acce88145d818f749ac93eeb75"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.867749 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" podStartSLOduration=26.927637758 podStartE2EDuration="36.867730427s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:39.56686254 +0000 UTC m=+1748.411823591" lastFinishedPulling="2025-12-03 12:42:49.506955209 +0000 UTC m=+1758.351916260" observedRunningTime="2025-12-03 12:42:50.859994808 +0000 UTC m=+1759.704955859" watchObservedRunningTime="2025-12-03 12:42:50.867730427 +0000 UTC m=+1759.712691478" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.888411 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" event={"ID":"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f","Type":"ContainerStarted","Data":"f613422abd8c4e47b29d54a625911a5662657879ccd395aa5e6ed026ef051546"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.888462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" event={"ID":"f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f","Type":"ContainerStarted","Data":"d6d9b9039584bfc92bc9190ac09e40a6cda8589ceda32d6ba8ee988f17b71553"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.889075 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.912005 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" event={"ID":"75171f12-3098-437a-a941-31312676f362","Type":"ContainerStarted","Data":"8b9a0d91308cfa46120e3f4d9b66e4a9697feca14ab8c8a276eec2417bd223e4"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.912937 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.914544 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-bcvbt" podStartSLOduration=3.251380572 podStartE2EDuration="36.914525601s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.970679046 +0000 UTC m=+1724.815640097" lastFinishedPulling="2025-12-03 12:42:49.633824065 +0000 UTC m=+1758.478785126" observedRunningTime="2025-12-03 12:42:50.909433433 +0000 UTC m=+1759.754394474" watchObservedRunningTime="2025-12-03 12:42:50.914525601 +0000 UTC m=+1759.759486652" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.914956 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.936830 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" 
event={"ID":"20594b02-a42f-4747-abfc-cbee34847d81","Type":"ContainerStarted","Data":"ed8e6b21ac635d97170fa1cce478ff1f3fcde7832d1eb2cd335c00aa8f9d26d9"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.937536 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.941099 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.941153 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54468f9998-5pr6c" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.947059 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" event={"ID":"6b5f798a-8be3-4c12-948b-4b9ff35d14ba","Type":"ContainerStarted","Data":"b982d51dfdecd3607af6557bb8511e4ed53ed159b31032637306a4996dcae3b4"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.947732 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.951176 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.955876 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" podStartSLOduration=4.158572626 podStartE2EDuration="36.955854867s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.116462094 +0000 UTC m=+1724.961423145" lastFinishedPulling="2025-12-03 12:42:48.913744325 +0000 UTC m=+1757.758705386" observedRunningTime="2025-12-03 12:42:50.950428871 +0000 UTC m=+1759.795389922" watchObservedRunningTime="2025-12-03 12:42:50.955854867 +0000 UTC m=+1759.800815918" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.971535 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" event={"ID":"e4686a7d-808f-47e8-b5cd-ec3af299a7f2","Type":"ContainerStarted","Data":"97ad616221be88f02bf16afb568c486cd844850ae6936956d5e7c32e218042cc"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.974272 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d49kn" podStartSLOduration=3.416177583 podStartE2EDuration="36.974255704s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.052334932 +0000 UTC m=+1724.897295983" lastFinishedPulling="2025-12-03 12:42:49.610413053 +0000 UTC m=+1758.455374104" observedRunningTime="2025-12-03 12:42:50.973052712 +0000 UTC m=+1759.818013763" watchObservedRunningTime="2025-12-03 12:42:50.974255704 +0000 UTC m=+1759.819216755" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.974724 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" 
event={"ID":"72fef244-af95-4c84-889b-04317e2f85e4","Type":"ContainerStarted","Data":"4e79440e77459a755c50b4f22add407e38a4f7dda134e9abc63fe7ce0ddf4826"} Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.974872 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.981039 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" Dec 03 12:42:50 crc kubenswrapper[4666]: I1203 12:42:50.984497 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" event={"ID":"d90e913f-9878-4644-b0f7-d0e313b8f897","Type":"ContainerStarted","Data":"2c6de1406f618efda8d31188c37b890047f08f37ff0149d407b3ca8a38b98051"} Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.010876 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-j6dh9" podStartSLOduration=3.885480467 podStartE2EDuration="38.010832532s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.509520249 +0000 UTC m=+1724.354481300" lastFinishedPulling="2025-12-03 12:42:49.634872314 +0000 UTC m=+1758.479833365" observedRunningTime="2025-12-03 12:42:50.996583047 +0000 UTC m=+1759.841544098" watchObservedRunningTime="2025-12-03 12:42:51.010832532 +0000 UTC m=+1759.855793593" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.038945 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-c4plc" podStartSLOduration=4.163100387 podStartE2EDuration="38.038926511s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.773655644 +0000 UTC m=+1724.618616695" lastFinishedPulling="2025-12-03 12:42:49.649481768 +0000 UTC m=+1758.494442819" observedRunningTime="2025-12-03 12:42:51.024592594 +0000 UTC m=+1759.869553635" watchObservedRunningTime="2025-12-03 12:42:51.038926511 +0000 UTC m=+1759.883887562" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.100810 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-x2zqw" podStartSLOduration=3.602118716 podStartE2EDuration="37.100792842s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.111812329 +0000 UTC m=+1724.956773380" lastFinishedPulling="2025-12-03 12:42:49.610486445 +0000 UTC m=+1758.455447506" observedRunningTime="2025-12-03 12:42:51.071414619 +0000 UTC m=+1759.916375670" watchObservedRunningTime="2025-12-03 12:42:51.100792842 +0000 UTC m=+1759.945753893" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.211389 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp8qs" podStartSLOduration=4.351552277 podStartE2EDuration="38.211368869s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.753860429 +0000 UTC m=+1724.598821480" lastFinishedPulling="2025-12-03 12:42:49.613677021 +0000 UTC m=+1758.458638072" observedRunningTime="2025-12-03 12:42:51.157578886 +0000 UTC m=+1760.002539937" watchObservedRunningTime="2025-12-03 12:42:51.211368869 +0000 UTC 
m=+1760.056329930" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.993556 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" event={"ID":"1d364f72-b379-4591-b3f4-17997cbcba6e","Type":"ContainerStarted","Data":"19fd2e53554d1876855fd6b5ab424f6e2583b65348a9c05c0d900e5cccea19c5"} Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.994344 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.995547 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" event={"ID":"e4686a7d-808f-47e8-b5cd-ec3af299a7f2","Type":"ContainerStarted","Data":"5c9c61b755fb959a201a7015c91bfde8b67e005db04efd3b3a1ea300817137b9"} Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.997756 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" event={"ID":"78813232-79b6-4483-86cb-069995914531","Type":"ContainerStarted","Data":"50515e3a6fadc5b97ea5bd19f22b3dab8ce276da11c6dcab13266b45ae0a6063"} Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.997989 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:51 crc kubenswrapper[4666]: I1203 12:42:51.999807 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" event={"ID":"21ae197d-ae5d-4129-b1db-114a42dc5eb8","Type":"ContainerStarted","Data":"a0103d183bb0b526300532c3daeb780dd3f07067caf714ced1e716b6136a9a20"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.000043 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.001687 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" event={"ID":"8048d3e0-a035-4a85-92ad-ca11dc24ccbe","Type":"ContainerStarted","Data":"9210fcec941c9013833501b625d6594f4df99c94ec5d7ca41034d54006833c63"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.001906 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.003639 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" event={"ID":"5a914e37-4302-4c77-8d4b-6c509dfbfc4e","Type":"ContainerStarted","Data":"3bda38864535984ab8506c8236cf42278769ac4cd75625952f02d273aefbb7d5"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.003905 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.005612 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" event={"ID":"1d9a62f9-0c20-4033-84d4-ade04922d04a","Type":"ContainerStarted","Data":"b7b33e9b99e2dd962c1bbdf937b36dfd52f1a2312d3c8aa0abc23b868d589471"} Dec 03 12:42:52 crc kubenswrapper[4666]: 
I1203 12:42:52.005770 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.005898 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.007351 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" event={"ID":"b3582f8c-2777-4291-bc6a-42953fd2d928","Type":"ContainerStarted","Data":"77353030ffc27ec9f31435319c1ee60fbdb870e53cec3ebe4e7fc36f1d9c582a"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.007868 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.009436 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" event={"ID":"e9197948-361b-43e7-8cc6-db509c80c7b1","Type":"ContainerStarted","Data":"6b9b972b231bc750c5070b11f888b324f1d37ec3a1d90e3cee2816802c145301"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.009835 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.011669 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" event={"ID":"330ae135-611a-4ae6-ba73-fcb6a911c299","Type":"ContainerStarted","Data":"0f9c802b5c923b1c14c119ba552d7a319a78eeafe57fd28654161948b61b3546"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.011858 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.013410 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" event={"ID":"d90e913f-9878-4644-b0f7-d0e313b8f897","Type":"ContainerStarted","Data":"225e283caa16f9cde21c8edf2305f183767b1441257b36feeb7a534605d3d58d"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.013568 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.014646 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.015317 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" event={"ID":"dce8c65f-3951-4e68-a044-c4c59638fd05","Type":"ContainerStarted","Data":"efa1e0d415f013b59ee78a72d7477b02a79f7447786693b67c02df2c98583787"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.015362 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.016987 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.017328 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" event={"ID":"deefb3d8-d96a-4e86-839d-8d8a561f4645","Type":"ContainerStarted","Data":"4316871591fecee4b0a77497c9d33bb67e5eee49e61010e5f548092b5d4ab8e4"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.017895 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.019680 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" event={"ID":"39681ef6-2d50-4509-a81e-d6cd102695cd","Type":"ContainerStarted","Data":"2c6800503a8413b8b53835765f65707dc26befbe39239a4a402b23a023c21d5d"} Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.020820 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.027780 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" podStartSLOduration=4.564641866 podStartE2EDuration="38.027766333s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.055373544 +0000 UTC m=+1724.900334595" lastFinishedPulling="2025-12-03 12:42:49.518497991 +0000 UTC m=+1758.363459062" observedRunningTime="2025-12-03 12:42:52.024217627 +0000 UTC m=+1760.869178678" watchObservedRunningTime="2025-12-03 12:42:52.027766333 +0000 UTC m=+1760.872727384" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.054009 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" podStartSLOduration=4.627992267 podStartE2EDuration="38.053982771s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.020976205 +0000 UTC m=+1724.865937246" lastFinishedPulling="2025-12-03 12:42:49.446966699 +0000 UTC m=+1758.291927750" observedRunningTime="2025-12-03 12:42:52.047478075 +0000 UTC m=+1760.892439126" watchObservedRunningTime="2025-12-03 12:42:52.053982771 +0000 UTC m=+1760.898943822" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.073131 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" podStartSLOduration=2.271779649 podStartE2EDuration="38.073108487s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.776990734 +0000 UTC m=+1724.621951785" lastFinishedPulling="2025-12-03 12:42:51.578319572 +0000 UTC m=+1760.423280623" observedRunningTime="2025-12-03 12:42:52.06803455 +0000 UTC m=+1760.912995601" watchObservedRunningTime="2025-12-03 12:42:52.073108487 +0000 UTC m=+1760.918069538" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.091432 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" podStartSLOduration=29.203147883 podStartE2EDuration="39.091411282s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 
12:42:39.557678402 +0000 UTC m=+1748.402639453" lastFinishedPulling="2025-12-03 12:42:49.445941791 +0000 UTC m=+1758.290902852" observedRunningTime="2025-12-03 12:42:52.089983393 +0000 UTC m=+1760.934944464" watchObservedRunningTime="2025-12-03 12:42:52.091411282 +0000 UTC m=+1760.936372333" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.122846 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" podStartSLOduration=5.725645285 podStartE2EDuration="39.122827641s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.049744102 +0000 UTC m=+1724.894705153" lastFinishedPulling="2025-12-03 12:42:49.446926458 +0000 UTC m=+1758.291887509" observedRunningTime="2025-12-03 12:42:52.117755984 +0000 UTC m=+1760.962717035" watchObservedRunningTime="2025-12-03 12:42:52.122827641 +0000 UTC m=+1760.967788692" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.148551 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-97svz" podStartSLOduration=5.523401783 podStartE2EDuration="39.148533875s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.009609498 +0000 UTC m=+1724.854570549" lastFinishedPulling="2025-12-03 12:42:49.63474159 +0000 UTC m=+1758.479702641" observedRunningTime="2025-12-03 12:42:52.147441435 +0000 UTC m=+1760.992402476" watchObservedRunningTime="2025-12-03 12:42:52.148533875 +0000 UTC m=+1760.993494926" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.171443 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" podStartSLOduration=3.43984834 podStartE2EDuration="39.171418273s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.750761476 +0000 UTC m=+1724.595722527" lastFinishedPulling="2025-12-03 12:42:51.482331409 +0000 UTC m=+1760.327292460" observedRunningTime="2025-12-03 12:42:52.166423858 +0000 UTC m=+1761.011384909" watchObservedRunningTime="2025-12-03 12:42:52.171418273 +0000 UTC m=+1761.016379324" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.195917 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" podStartSLOduration=5.045257847 podStartE2EDuration="39.195886254s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.502475349 +0000 UTC m=+1724.347436400" lastFinishedPulling="2025-12-03 12:42:49.653103756 +0000 UTC m=+1758.498064807" observedRunningTime="2025-12-03 12:42:52.185816422 +0000 UTC m=+1761.030777493" watchObservedRunningTime="2025-12-03 12:42:52.195886254 +0000 UTC m=+1761.040847305" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.217220 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" podStartSLOduration=4.744543916 podStartE2EDuration="38.21720232s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.061033037 +0000 UTC m=+1724.905994088" lastFinishedPulling="2025-12-03 12:42:49.533691441 +0000 UTC m=+1758.378652492" observedRunningTime="2025-12-03 12:42:52.208816503 +0000 UTC m=+1761.053777564" watchObservedRunningTime="2025-12-03 
12:42:52.21720232 +0000 UTC m=+1761.062163371" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.240240 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-khplb" podStartSLOduration=4.612036897 podStartE2EDuration="38.240221952s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.005473586 +0000 UTC m=+1724.850434637" lastFinishedPulling="2025-12-03 12:42:49.633658641 +0000 UTC m=+1758.478619692" observedRunningTime="2025-12-03 12:42:52.237623552 +0000 UTC m=+1761.082584613" watchObservedRunningTime="2025-12-03 12:42:52.240221952 +0000 UTC m=+1761.085183003" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.266162 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bwxhd" podStartSLOduration=4.60213095 podStartE2EDuration="38.266141092s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.970188513 +0000 UTC m=+1724.815149564" lastFinishedPulling="2025-12-03 12:42:49.634198655 +0000 UTC m=+1758.479159706" observedRunningTime="2025-12-03 12:42:52.264299472 +0000 UTC m=+1761.109260523" watchObservedRunningTime="2025-12-03 12:42:52.266141092 +0000 UTC m=+1761.111102143" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.301454 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9cbjr" podStartSLOduration=5.384059098 podStartE2EDuration="39.301433415s" podCreationTimestamp="2025-12-03 12:42:13 +0000 UTC" firstStartedPulling="2025-12-03 12:42:15.760443847 +0000 UTC m=+1724.605404898" lastFinishedPulling="2025-12-03 12:42:49.677818164 +0000 UTC m=+1758.522779215" observedRunningTime="2025-12-03 12:42:52.297000176 +0000 UTC m=+1761.141961227" watchObservedRunningTime="2025-12-03 12:42:52.301433415 +0000 UTC m=+1761.146394466" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.352053 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" podStartSLOduration=4.853366766 podStartE2EDuration="38.352032532s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.061108019 +0000 UTC m=+1724.906069070" lastFinishedPulling="2025-12-03 12:42:49.559773785 +0000 UTC m=+1758.404734836" observedRunningTime="2025-12-03 12:42:52.325477475 +0000 UTC m=+1761.170438526" watchObservedRunningTime="2025-12-03 12:42:52.352032532 +0000 UTC m=+1761.196993583" Dec 03 12:42:52 crc kubenswrapper[4666]: I1203 12:42:52.358484 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-7lkjq" podStartSLOduration=4.745331288 podStartE2EDuration="38.358460306s" podCreationTimestamp="2025-12-03 12:42:14 +0000 UTC" firstStartedPulling="2025-12-03 12:42:16.020632716 +0000 UTC m=+1724.865593767" lastFinishedPulling="2025-12-03 12:42:49.633761714 +0000 UTC m=+1758.478722785" observedRunningTime="2025-12-03 12:42:52.34752366 +0000 UTC m=+1761.192484721" watchObservedRunningTime="2025-12-03 12:42:52.358460306 +0000 UTC m=+1761.203421347" Dec 03 12:42:53 crc kubenswrapper[4666]: I1203 12:42:53.027400 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" 
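A note on the "Observed pod startup duration" records above: podStartE2EDuration is the wall-clock gap from podCreationTimestamp to watchObservedRunningTime, while podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling to firstStartedPulling), so registry pull time does not count against the startup SLO. A minimal sketch that re-derives both figures for the test-operator pod from the timestamps logged above (the parse() helper is ours for illustration, not kubelet code):

    from datetime import datetime

    def parse(ts: str) -> datetime:
        # Log timestamps look like "2025-12-03 12:42:16.061033037 +0000 UTC";
        # Python's datetime keeps only microseconds, so trim the fraction to 6 digits.
        date, time = ts.split()[:2]
        if "." in time:
            hms, frac = time.split(".")
            time = f"{hms}.{frac[:6]}"
        return datetime.fromisoformat(f"{date}T{time}+00:00")

    created   = parse("2025-12-03 12:42:14 +0000 UTC")            # podCreationTimestamp
    pull_from = parse("2025-12-03 12:42:16.061033037 +0000 UTC")  # firstStartedPulling
    pull_to   = parse("2025-12-03 12:42:49.533691441 +0000 UTC")  # lastFinishedPulling
    running   = parse("2025-12-03 12:42:52.21720232 +0000 UTC")   # watchObservedRunningTime

    e2e = (running - created).total_seconds()
    slo = e2e - (pull_to - pull_from).total_seconds()
    print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")  # E2E=38.217202s SLO=4.744544s, matching the log entry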
Dec 03 12:42:53 crc kubenswrapper[4666]: I1203 12:42:53.029921 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:53 crc kubenswrapper[4666]: I1203 12:42:53.030261 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:42:53 crc kubenswrapper[4666]: I1203 12:42:53.030637 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-jzzkf" Dec 03 12:42:54 crc kubenswrapper[4666]: I1203 12:42:54.042463 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xwp75" Dec 03 12:42:54 crc kubenswrapper[4666]: I1203 12:42:54.046919 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-vxrg7" Dec 03 12:42:55 crc kubenswrapper[4666]: I1203 12:42:55.143943 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-rt2zn" Dec 03 12:42:55 crc kubenswrapper[4666]: I1203 12:42:55.379732 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xmmjr" Dec 03 12:43:00 crc kubenswrapper[4666]: I1203 12:43:00.772579 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt" Dec 03 12:43:03 crc kubenswrapper[4666]: I1203 12:43:03.425378 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:43:03 crc kubenswrapper[4666]: E1203 12:43:03.426150 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:43:04 crc kubenswrapper[4666]: I1203 12:43:04.400258 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dn9wg" Dec 03 12:43:04 crc kubenswrapper[4666]: I1203 12:43:04.508037 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ckxvk" Dec 03 12:43:04 crc kubenswrapper[4666]: I1203 12:43:04.627335 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-n287g" Dec 03 12:43:04 crc kubenswrapper[4666]: I1203 12:43:04.683953 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5797d476c-ntgb9" Dec 03 12:43:04 crc kubenswrapper[4666]: I1203 12:43:04.866615 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-q5d7b" Dec 03 12:43:18 crc kubenswrapper[4666]: I1203 12:43:18.428526 4666 scope.go:117] 
"RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:43:18 crc kubenswrapper[4666]: E1203 12:43:18.430013 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:43:22 crc kubenswrapper[4666]: I1203 12:43:22.435592 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.439824 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.445368 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.445399 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kht8c" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.445523 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.445606 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.455133 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.492900 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.495932 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.499157 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.522048 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.595882 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.595961 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4f2\" (UniqueName: \"kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.596020 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.596057 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvvv\" (UniqueName: \"kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.596103 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.697394 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.697457 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4f2\" (UniqueName: \"kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.697503 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 
03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.697532 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvvv\" (UniqueName: \"kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.697555 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.698556 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.698710 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.698918 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.724187 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4f2\" (UniqueName: \"kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2\") pod \"dnsmasq-dns-675f4bcbfc-6bq9n\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.725881 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvvv\" (UniqueName: \"kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv\") pod \"dnsmasq-dns-78dd6ddcc-4gl5p\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.762562 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:22.823434 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.717989 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.759258 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"] Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.760518 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.788336 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"]
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.945395 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.945474 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvnt\" (UniqueName: \"kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:25.945708 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.046979 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvnt\" (UniqueName: \"kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.047078 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.047137 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.048267 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.048397 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.082251 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvnt\" (UniqueName: \"kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt\") pod \"dnsmasq-dns-666b6646f7-htg9r\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.093394 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-htg9r"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.109594 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"]
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.139028 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"]
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.140252 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.163574 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"]
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.252835 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.252988 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv62k\" (UniqueName: \"kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.253341 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.354553 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.354938 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.354967 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv62k\" (UniqueName: \"kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.355646 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.356381 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.392740 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv62k\" (UniqueName: \"kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k\") pod \"dnsmasq-dns-57d769cc4f-qk8cx\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.476130 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx"
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.696240 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"]
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.703831 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"]
Dec 03 12:43:26 crc kubenswrapper[4666]: W1203 12:43:26.705157 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5f5c74_d6e2_49e6_a1da_813c6cf8948d.slice/crio-08c04f5d3190fee3d131bec67af51c9d55c028c7a0e6bddbb0a74a9c4c2adf48 WatchSource:0}: Error finding container 08c04f5d3190fee3d131bec67af51c9d55c028c7a0e6bddbb0a74a9c4c2adf48: Status 404 returned error can't find the container with id 08c04f5d3190fee3d131bec67af51c9d55c028c7a0e6bddbb0a74a9c4c2adf48
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.705746 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.708948 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"]
Dec 03 12:43:26 crc kubenswrapper[4666]: W1203 12:43:26.712850 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf1a8ea_35a4_485d_a22b_2a358bd0ce92.slice/crio-d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec WatchSource:0}: Error finding container d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec: Status 404 returned error can't find the container with id d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec
Dec 03 12:43:26 crc kubenswrapper[4666]: I1203 12:43:26.959707 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"]
Dec 03 12:43:26 crc kubenswrapper[4666]: W1203 12:43:26.961858 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ecec21_800f_41f4_b62e_7755102e5f1a.slice/crio-5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a WatchSource:0}: Error finding container 5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a: Status 404 returned error can't find the container with id 5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.147180 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.148723 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.153806 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.153964 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.154103 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jmqzx"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.154534 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.156420 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.156576 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.156668 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.174621 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.261226 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271294 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271368 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271407 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271455 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271485 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271509 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7gp4\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271550 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271580 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271601 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271629 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.271663 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.281529 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285155 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285410 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285481 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285662 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285822 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.285827 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xb68b"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.291365 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.303013 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.362596 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" event={"ID":"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92","Type":"ContainerStarted","Data":"d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec"}
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.364704 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" event={"ID":"45ecec21-800f-41f4-b62e-7755102e5f1a","Type":"ContainerStarted","Data":"5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a"}
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.368390 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" event={"ID":"b12d50f8-ee2e-4f0d-b998-8a16085d406f","Type":"ContainerStarted","Data":"34a2468075b8d6dc0608a8cfb9fc9d0ae6a206db4e988e598f83fba8b39c64bd"}
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.373153 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" event={"ID":"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d","Type":"ContainerStarted","Data":"08c04f5d3190fee3d131bec67af51c9d55c028c7a0e6bddbb0a74a9c4c2adf48"}
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377530 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377582 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377643 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0"
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377697 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377730 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbmw\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377750 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377778 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377803 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377832 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377871 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377888 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7gp4\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377914 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377943 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377966 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.377986 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378012 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378038 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378059 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378079 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378125 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378159 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.378694 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.380338 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.380861 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.380972 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.381040 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.381452 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.381726 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.386258 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.387157 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.392510 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.398208 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.407497 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7gp4\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.416473 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " pod="openstack/rabbitmq-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482260 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482323 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482354 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482389 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482417 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482459 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482532 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbmw\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482559 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482590 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.482622 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.483371 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.483355 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.483450 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.483707 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.484192 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.484743 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.486547 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.486784 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.486961 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.488505 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.503498 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.504031 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbmw\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.517545 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.613273 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.889282 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: W1203 12:43:27.893202 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ae1478_c8e5_4175_bf32_f96a34996999.slice/crio-edfe886cb548f638548646264377e416e66f2b4d59fa609709d07d0e79c3f088 WatchSource:0}: Error finding container edfe886cb548f638548646264377e416e66f2b4d59fa609709d07d0e79c3f088: Status 404 returned error can't find the container with id edfe886cb548f638548646264377e416e66f2b4d59fa609709d07d0e79c3f088
Dec 03 12:43:27 crc kubenswrapper[4666]: I1203 12:43:27.958917 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 12:43:27 crc kubenswrapper[4666]: W1203 12:43:27.962599 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ef012c_8962_43f6_9c95_5a880aa57d5a.slice/crio-f83c3755957db5418fd4963f1cb6afc786cbf8d020deb5cc2f67586c3b756754 WatchSource:0}: Error finding container f83c3755957db5418fd4963f1cb6afc786cbf8d020deb5cc2f67586c3b756754: Status 404 returned error can't find the container with id f83c3755957db5418fd4963f1cb6afc786cbf8d020deb5cc2f67586c3b756754
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.390342 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerStarted","Data":"f83c3755957db5418fd4963f1cb6afc786cbf8d020deb5cc2f67586c3b756754"}
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.396283 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerStarted","Data":"edfe886cb548f638548646264377e416e66f2b4d59fa609709d07d0e79c3f088"}
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.439250 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.440939 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.464247 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.466228 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.466451 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.467506 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.468244 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.469314 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d57c6"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497535 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497589 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497672 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497720 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497761 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497821 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497873 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.497934 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2st\" (UniqueName: \"kubernetes.io/projected/43a43416-3214-46ba-8a00-6939bb265c8a-kube-api-access-mb2st\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600076 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600163 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600214 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2st\" (UniqueName: \"kubernetes.io/projected/43a43416-3214-46ba-8a00-6939bb265c8a-kube-api-access-mb2st\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600254 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600277 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600303 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600324 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.600353 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.601509 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.602280 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.604029 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a43416-3214-46ba-8a00-6939bb265c8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.604669 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/43a43416-3214-46ba-8a00-6939bb265c8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.604844 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.620860 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.621810 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a43416-3214-46ba-8a00-6939bb265c8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.623338 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2st\" (UniqueName: \"kubernetes.io/projected/43a43416-3214-46ba-8a00-6939bb265c8a-kube-api-access-mb2st\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.625709 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"43a43416-3214-46ba-8a00-6939bb265c8a\") " pod="openstack/openstack-galera-0"
Dec 03 12:43:28 crc kubenswrapper[4666]: I1203 12:43:28.784630 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.214475 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.406577 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"43a43416-3214-46ba-8a00-6939bb265c8a","Type":"ContainerStarted","Data":"f81e5662b79c1b44616be0659f0330b162fceee30a8509ac7baac1266f451f69"}
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.837414 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.843150 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.852851 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.854929 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9sktq"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.855185 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.858312 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.871484 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965353 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965427 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965535 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965566 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965672 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965745 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pbd\" (UniqueName: \"kubernetes.io/projected/d17da10a-51b4-4ed7-8bbc-2b37be248419-kube-api-access-d7pbd\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965810 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:29 crc kubenswrapper[4666]: I1203 12:43:29.965864 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.077978 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078096 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pbd\" (UniqueName: \"kubernetes.io/projected/d17da10a-51b4-4ed7-8bbc-2b37be248419-kube-api-access-d7pbd\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078156 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078206 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078230 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078245 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078304 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.078336 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.079442 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.081591 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.082152 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.082729 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.083136 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d17da10a-51b4-4ed7-8bbc-2b37be248419-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.088510 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0"
Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.104941 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d17da10a-51b4-4ed7-8bbc-2b37be248419-galera-tls-certs\") pod \"openstack-cell1-galera-0\"
(UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.124999 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pbd\" (UniqueName: \"kubernetes.io/projected/d17da10a-51b4-4ed7-8bbc-2b37be248419-kube-api-access-d7pbd\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.129190 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d17da10a-51b4-4ed7-8bbc-2b37be248419\") " pod="openstack/openstack-cell1-galera-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.186125 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.187188 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.190270 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.190648 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6gzqx" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.192520 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.192642 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.203780 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.282244 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.282296 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsqh\" (UniqueName: \"kubernetes.io/projected/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kube-api-access-hpsqh\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.282321 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-config-data\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.282372 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 
12:43:30.282395 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kolla-config\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.383668 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.383713 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsqh\" (UniqueName: \"kubernetes.io/projected/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kube-api-access-hpsqh\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.383734 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-config-data\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.383790 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.383807 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kolla-config\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.384645 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kolla-config\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.385185 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-config-data\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.390412 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.405551 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 
12:43:30.410854 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsqh\" (UniqueName: \"kubernetes.io/projected/e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53-kube-api-access-hpsqh\") pod \"memcached-0\" (UID: \"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53\") " pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.537262 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 12:43:30 crc kubenswrapper[4666]: I1203 12:43:30.853052 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.156213 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 12:43:31 crc kubenswrapper[4666]: W1203 12:43:31.245889 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4cbcde8_77f8_47d9_b6c6_c52f5daf4f53.slice/crio-d78a2e79ba15743f9d124eee27619b0cfa859e120f37ab629c384796eae95fa5 WatchSource:0}: Error finding container d78a2e79ba15743f9d124eee27619b0cfa859e120f37ab629c384796eae95fa5: Status 404 returned error can't find the container with id d78a2e79ba15743f9d124eee27619b0cfa859e120f37ab629c384796eae95fa5 Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.514883 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53","Type":"ContainerStarted","Data":"d78a2e79ba15743f9d124eee27619b0cfa859e120f37ab629c384796eae95fa5"} Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.539758 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d17da10a-51b4-4ed7-8bbc-2b37be248419","Type":"ContainerStarted","Data":"8afc9548d9e7a696d28521f5eb50dfc16c3118a0ddafe4b04bd4d3c707003e54"} Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.869398 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.874982 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.880244 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8zqjm" Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.886983 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:43:31 crc kubenswrapper[4666]: I1203 12:43:31.936639 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pht\" (UniqueName: \"kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht\") pod \"kube-state-metrics-0\" (UID: \"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74\") " pod="openstack/kube-state-metrics-0" Dec 03 12:43:32 crc kubenswrapper[4666]: I1203 12:43:32.037884 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pht\" (UniqueName: \"kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht\") pod \"kube-state-metrics-0\" (UID: \"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74\") " pod="openstack/kube-state-metrics-0" Dec 03 12:43:32 crc kubenswrapper[4666]: I1203 12:43:32.078277 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pht\" (UniqueName: \"kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht\") pod \"kube-state-metrics-0\" (UID: \"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74\") " pod="openstack/kube-state-metrics-0" Dec 03 12:43:32 crc kubenswrapper[4666]: I1203 12:43:32.209929 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:43:32 crc kubenswrapper[4666]: I1203 12:43:32.913895 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:43:33 crc kubenswrapper[4666]: I1203 12:43:33.424544 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:43:33 crc kubenswrapper[4666]: E1203 12:43:33.424785 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:43:33 crc kubenswrapper[4666]: I1203 12:43:33.600550 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74","Type":"ContainerStarted","Data":"9c9d12921ce0e191ed32472380d32cca4f6bb0ba1fdafbc6c4c5b433d6eecf46"} Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.629543 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nsf9r"] Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.635337 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.638680 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5kn6w" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.638704 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.640559 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.669407 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nsf9r"] Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.682846 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mdtd4"] Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.684662 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.708121 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mdtd4"] Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716698 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-log-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716738 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-ovn-controller-tls-certs\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716769 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-run\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716787 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d005e60-fcd2-4546-a783-e4770dd9e1d5-scripts\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716802 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-scripts\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716855 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " 
pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716871 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-log\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716892 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-688l7\" (UniqueName: \"kubernetes.io/projected/2d005e60-fcd2-4546-a783-e4770dd9e1d5-kube-api-access-688l7\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716907 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbz6\" (UniqueName: \"kubernetes.io/projected/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-kube-api-access-fjbz6\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716948 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-etc-ovs\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.716981 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-lib\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.717008 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.717079 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-combined-ca-bundle\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818768 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-etc-ovs\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818838 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-lib\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc 
kubenswrapper[4666]: I1203 12:43:35.818873 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818907 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-combined-ca-bundle\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818940 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-log-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818956 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-ovn-controller-tls-certs\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818974 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-run\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.818988 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d005e60-fcd2-4546-a783-e4770dd9e1d5-scripts\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819000 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-scripts\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819036 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819058 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-log\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819080 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-688l7\" (UniqueName: 
\"kubernetes.io/projected/2d005e60-fcd2-4546-a783-e4770dd9e1d5-kube-api-access-688l7\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819111 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbz6\" (UniqueName: \"kubernetes.io/projected/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-kube-api-access-fjbz6\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819826 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-log-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.819981 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-lib\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.820183 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-etc-ovs\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.821248 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.821374 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-log\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.821431 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d005e60-fcd2-4546-a783-e4770dd9e1d5-var-run\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.821456 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-var-run-ovn\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.823359 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-scripts\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.823417 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d005e60-fcd2-4546-a783-e4770dd9e1d5-scripts\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.830367 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-combined-ca-bundle\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.838543 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-ovn-controller-tls-certs\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.838932 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbz6\" (UniqueName: \"kubernetes.io/projected/e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342-kube-api-access-fjbz6\") pod \"ovn-controller-nsf9r\" (UID: \"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342\") " pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.854775 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-688l7\" (UniqueName: \"kubernetes.io/projected/2d005e60-fcd2-4546-a783-e4770dd9e1d5-kube-api-access-688l7\") pod \"ovn-controller-ovs-mdtd4\" (UID: \"2d005e60-fcd2-4546-a783-e4770dd9e1d5\") " pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:35 crc kubenswrapper[4666]: I1203 12:43:35.972519 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nsf9r" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.003730 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.059112 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.060387 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.063063 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.065588 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.073043 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.073287 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-58fr6" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.074460 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.075150 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.123757 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.123855 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.123874 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-config\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.124001 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.124020 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vcsp\" (UniqueName: \"kubernetes.io/projected/55083d6a-bded-48e2-a0ce-3befa24ce873-kube-api-access-7vcsp\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.124045 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.124066 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.124107 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225748 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225803 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225888 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225905 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-config\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225940 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225956 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vcsp\" (UniqueName: \"kubernetes.io/projected/55083d6a-bded-48e2-a0ce-3befa24ce873-kube-api-access-7vcsp\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.225979 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.226002 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 
12:43:36.227305 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.227704 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.227968 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55083d6a-bded-48e2-a0ce-3befa24ce873-config\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.228026 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.232988 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.234062 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.244546 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55083d6a-bded-48e2-a0ce-3befa24ce873-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.250934 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vcsp\" (UniqueName: \"kubernetes.io/projected/55083d6a-bded-48e2-a0ce-3befa24ce873-kube-api-access-7vcsp\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.255242 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"55083d6a-bded-48e2-a0ce-3befa24ce873\") " pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:36 crc kubenswrapper[4666]: I1203 12:43:36.395547 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.616199 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.618408 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.625994 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.626556 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wsp4s" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.626702 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.626926 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.647439 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.702893 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703006 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703221 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703340 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703412 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crdz2\" (UniqueName: \"kubernetes.io/projected/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-kube-api-access-crdz2\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703529 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " 
pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.703719 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805219 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805675 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805700 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crdz2\" (UniqueName: \"kubernetes.io/projected/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-kube-api-access-crdz2\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805731 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805759 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805789 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805849 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.805898 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.806325 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.806643 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.807347 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.807682 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.815010 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.815010 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.815768 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.825703 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crdz2\" (UniqueName: \"kubernetes.io/projected/3bdd6a24-e604-459e-8eba-ea0d2638fdf5-kube-api-access-crdz2\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.836629 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bdd6a24-e604-459e-8eba-ea0d2638fdf5\") " pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:39 crc kubenswrapper[4666]: I1203 12:43:39.960974 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 12:43:45 crc kubenswrapper[4666]: I1203 12:43:45.423517 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:43:45 crc kubenswrapper[4666]: E1203 12:43:45.424646 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:43:56 crc kubenswrapper[4666]: I1203 12:43:56.423309 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:43:56 crc kubenswrapper[4666]: E1203 12:43:56.424496 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.403892 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.404573 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7pbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(d17da10a-51b4-4ed7-8bbc-2b37be248419): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.405942 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="d17da10a-51b4-4ed7-8bbc-2b37be248419" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.475611 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.475819 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mb2st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(43a43416-3214-46ba-8a00-6939bb265c8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.478419 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="43a43416-3214-46ba-8a00-6939bb265c8a" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.490771 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.492251 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 
20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7gp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(37ef012c-8962-43f6-9c95-5a880aa57d5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.493478 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.856944 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="43a43416-3214-46ba-8a00-6939bb265c8a" Dec 03 12:43:59 crc kubenswrapper[4666]: E1203 12:43:59.857324 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="d17da10a-51b4-4ed7-8bbc-2b37be248419" Dec 03 12:44:04 crc kubenswrapper[4666]: I1203 12:44:04.468922 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nsf9r"] Dec 03 12:44:04 crc kubenswrapper[4666]: I1203 12:44:04.698584 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 12:44:05 crc 
kubenswrapper[4666]: E1203 12:44:05.203729 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.204268 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5c4f2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6bq9n_openstack(cd5f5c74-d6e2-49e6-a1da-813c6cf8948d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.205602 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" podUID="cd5f5c74-d6e2-49e6-a1da-813c6cf8948d" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.219608 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.219789 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdvvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4gl5p_openstack(b12d50f8-ee2e-4f0d-b998-8a16085d406f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.221018 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" podUID="b12d50f8-ee2e-4f0d-b998-8a16085d406f" Dec 03 12:44:05 crc kubenswrapper[4666]: W1203 12:44:05.526796 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0d2b1d4_5613_4ee7_a95d_ba0bc47b8342.slice/crio-95c3a1acc4d65f63ef32dce6065e6d318d0a24054fa1eec9b4557b2c776fae3b WatchSource:0}: Error finding container 95c3a1acc4d65f63ef32dce6065e6d318d0a24054fa1eec9b4557b2c776fae3b: Status 404 returned error can't find the container with id 95c3a1acc4d65f63ef32dce6065e6d318d0a24054fa1eec9b4557b2c776fae3b Dec 03 12:44:05 crc kubenswrapper[4666]: W1203 12:44:05.529952 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55083d6a_bded_48e2_a0ce_3befa24ce873.slice/crio-5e454f4c21f0135109f9485efda85e2481cbcf38edd582f7b2ec2922e819d7ff WatchSource:0}: Error finding container 5e454f4c21f0135109f9485efda85e2481cbcf38edd582f7b2ec2922e819d7ff: Status 404 returned error can't find the container with id 5e454f4c21f0135109f9485efda85e2481cbcf38edd582f7b2ec2922e819d7ff Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.594213 4666 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.594974 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xv62k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-qk8cx_openstack(45ecec21-800f-41f4-b62e-7755102e5f1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.596244 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" podUID="45ecec21-800f-41f4-b62e-7755102e5f1a" Dec 03 12:44:05 crc kubenswrapper[4666]: I1203 12:44:05.602073 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mdtd4"] Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.644780 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.645525 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tvnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-htg9r_openstack(0cf1a8ea-35a4-485d-a22b-2a358bd0ce92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.647032 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" podUID="0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" Dec 03 12:44:05 crc kubenswrapper[4666]: I1203 12:44:05.903730 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55083d6a-bded-48e2-a0ce-3befa24ce873","Type":"ContainerStarted","Data":"5e454f4c21f0135109f9485efda85e2481cbcf38edd582f7b2ec2922e819d7ff"} Dec 03 12:44:05 crc kubenswrapper[4666]: I1203 12:44:05.905192 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r" event={"ID":"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342","Type":"ContainerStarted","Data":"95c3a1acc4d65f63ef32dce6065e6d318d0a24054fa1eec9b4557b2c776fae3b"} Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.907681 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" 
pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" podUID="45ecec21-800f-41f4-b62e-7755102e5f1a" Dec 03 12:44:05 crc kubenswrapper[4666]: E1203 12:44:05.908510 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" podUID="0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.156238 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 12:44:06 crc kubenswrapper[4666]: W1203 12:44:06.471852 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d005e60_fcd2_4546_a783_e4770dd9e1d5.slice/crio-3b98c5c4e67613fbd7c7e54b1f92e7233fe3e162f5d787bbf4bdc22b514af7d5 WatchSource:0}: Error finding container 3b98c5c4e67613fbd7c7e54b1f92e7233fe3e162f5d787bbf4bdc22b514af7d5: Status 404 returned error can't find the container with id 3b98c5c4e67613fbd7c7e54b1f92e7233fe3e162f5d787bbf4bdc22b514af7d5 Dec 03 12:44:06 crc kubenswrapper[4666]: E1203 12:44:06.473090 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 12:44:06 crc kubenswrapper[4666]: E1203 12:44:06.473183 4666 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 12:44:06 crc kubenswrapper[4666]: E1203 12:44:06.473348 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67pht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 12:44:06 crc kubenswrapper[4666]: E1203 12:44:06.474778 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" Dec 03 12:44:06 crc kubenswrapper[4666]: W1203 12:44:06.487802 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bdd6a24_e604_459e_8eba_ea0d2638fdf5.slice/crio-c8e05c6f70f4c363bc934aff13627ade4198c9261b82aa0710ac86923661ff30 WatchSource:0}: Error finding container c8e05c6f70f4c363bc934aff13627ade4198c9261b82aa0710ac86923661ff30: Status 404 returned error can't find the container with id c8e05c6f70f4c363bc934aff13627ade4198c9261b82aa0710ac86923661ff30 Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.754380 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.782303 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.913476 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mdtd4" event={"ID":"2d005e60-fcd2-4546-a783-e4770dd9e1d5","Type":"ContainerStarted","Data":"3b98c5c4e67613fbd7c7e54b1f92e7233fe3e162f5d787bbf4bdc22b514af7d5"} Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.914569 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bdd6a24-e604-459e-8eba-ea0d2638fdf5","Type":"ContainerStarted","Data":"c8e05c6f70f4c363bc934aff13627ade4198c9261b82aa0710ac86923661ff30"} Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.915571 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" event={"ID":"b12d50f8-ee2e-4f0d-b998-8a16085d406f","Type":"ContainerDied","Data":"34a2468075b8d6dc0608a8cfb9fc9d0ae6a206db4e988e598f83fba8b39c64bd"} Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.915674 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4gl5p" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.919602 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.919645 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6bq9n" event={"ID":"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d","Type":"ContainerDied","Data":"08c04f5d3190fee3d131bec67af51c9d55c028c7a0e6bddbb0a74a9c4c2adf48"} Dec 03 12:44:06 crc kubenswrapper[4666]: E1203 12:44:06.924729 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.934299 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc\") pod \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.934401 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config\") pod \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.935041 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config" (OuterVolumeSpecName: "config") pod "cd5f5c74-d6e2-49e6-a1da-813c6cf8948d" (UID: "cd5f5c74-d6e2-49e6-a1da-813c6cf8948d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.935240 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b12d50f8-ee2e-4f0d-b998-8a16085d406f" (UID: "b12d50f8-ee2e-4f0d-b998-8a16085d406f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.935914 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdvvv\" (UniqueName: \"kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv\") pod \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.935954 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config\") pod \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\" (UID: \"b12d50f8-ee2e-4f0d-b998-8a16085d406f\") " Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.936625 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config" (OuterVolumeSpecName: "config") pod "b12d50f8-ee2e-4f0d-b998-8a16085d406f" (UID: "b12d50f8-ee2e-4f0d-b998-8a16085d406f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.936742 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c4f2\" (UniqueName: \"kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2\") pod \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\" (UID: \"cd5f5c74-d6e2-49e6-a1da-813c6cf8948d\") " Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.937846 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.937879 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.937889 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d50f8-ee2e-4f0d-b998-8a16085d406f-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.944407 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2" (OuterVolumeSpecName: "kube-api-access-5c4f2") pod "cd5f5c74-d6e2-49e6-a1da-813c6cf8948d" (UID: "cd5f5c74-d6e2-49e6-a1da-813c6cf8948d"). InnerVolumeSpecName "kube-api-access-5c4f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:06 crc kubenswrapper[4666]: I1203 12:44:06.946999 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv" (OuterVolumeSpecName: "kube-api-access-sdvvv") pod "b12d50f8-ee2e-4f0d-b998-8a16085d406f" (UID: "b12d50f8-ee2e-4f0d-b998-8a16085d406f"). InnerVolumeSpecName "kube-api-access-sdvvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.039700 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdvvv\" (UniqueName: \"kubernetes.io/projected/b12d50f8-ee2e-4f0d-b998-8a16085d406f-kube-api-access-sdvvv\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.039751 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c4f2\" (UniqueName: \"kubernetes.io/projected/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d-kube-api-access-5c4f2\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.328603 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"] Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.337288 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6bq9n"] Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.357698 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"] Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.363310 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4gl5p"] Dec 03 12:44:07 crc kubenswrapper[4666]: E1203 12:44:07.385611 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12d50f8_ee2e_4f0d_b998_8a16085d406f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5f5c74_d6e2_49e6_a1da_813c6cf8948d.slice\": RecentStats: unable to find data in memory cache]" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.438427 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12d50f8-ee2e-4f0d-b998-8a16085d406f" path="/var/lib/kubelet/pods/b12d50f8-ee2e-4f0d-b998-8a16085d406f/volumes" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.439387 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5f5c74-d6e2-49e6-a1da-813c6cf8948d" path="/var/lib/kubelet/pods/cd5f5c74-d6e2-49e6-a1da-813c6cf8948d/volumes" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.932959 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53","Type":"ContainerStarted","Data":"34fabd944202f53998d2cc0d10888c8e8c7294f4358731806297eb05ff893545"} Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.933130 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 12:44:07 crc kubenswrapper[4666]: I1203 12:44:07.969773 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=4.080690162 podStartE2EDuration="37.969740276s" podCreationTimestamp="2025-12-03 12:43:30 +0000 UTC" firstStartedPulling="2025-12-03 12:43:31.256659957 +0000 UTC m=+1800.101621008" lastFinishedPulling="2025-12-03 12:44:05.145710041 +0000 UTC m=+1833.990671122" observedRunningTime="2025-12-03 12:44:07.968932674 +0000 UTC m=+1836.813893755" watchObservedRunningTime="2025-12-03 12:44:07.969740276 +0000 UTC m=+1836.814701337" Dec 03 12:44:08 crc kubenswrapper[4666]: I1203 12:44:08.423911 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:44:08 crc 
kubenswrapper[4666]: E1203 12:44:08.424276 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:44:08 crc kubenswrapper[4666]: I1203 12:44:08.942932 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerStarted","Data":"fca316e7ded99ec6b82b57df1075d0e8bf2f795f283750a00ec6b65878f7525c"} Dec 03 12:44:08 crc kubenswrapper[4666]: I1203 12:44:08.947702 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerStarted","Data":"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279"} Dec 03 12:44:10 crc kubenswrapper[4666]: I1203 12:44:10.966579 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mdtd4" event={"ID":"2d005e60-fcd2-4546-a783-e4770dd9e1d5","Type":"ContainerStarted","Data":"02ca61ea0d1a55fb26d1b664938624307dbc4311593615fb2510ad8aceaa7608"} Dec 03 12:44:10 crc kubenswrapper[4666]: I1203 12:44:10.972237 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55083d6a-bded-48e2-a0ce-3befa24ce873","Type":"ContainerStarted","Data":"38303876c7662d8a57cdaf67710a293ce278985493049dac4078c61bad4609df"} Dec 03 12:44:10 crc kubenswrapper[4666]: I1203 12:44:10.974066 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bdd6a24-e604-459e-8eba-ea0d2638fdf5","Type":"ContainerStarted","Data":"7d3ab98ef07ca6d20dc3f976c02cff908c90ec8f1f1a845e9ddbb8881f617d16"} Dec 03 12:44:11 crc kubenswrapper[4666]: I1203 12:44:11.987576 4666 generic.go:334] "Generic (PLEG): container finished" podID="2d005e60-fcd2-4546-a783-e4770dd9e1d5" containerID="02ca61ea0d1a55fb26d1b664938624307dbc4311593615fb2510ad8aceaa7608" exitCode=0 Dec 03 12:44:11 crc kubenswrapper[4666]: I1203 12:44:11.987703 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mdtd4" event={"ID":"2d005e60-fcd2-4546-a783-e4770dd9e1d5","Type":"ContainerDied","Data":"02ca61ea0d1a55fb26d1b664938624307dbc4311593615fb2510ad8aceaa7608"} Dec 03 12:44:11 crc kubenswrapper[4666]: I1203 12:44:11.989631 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r" event={"ID":"e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342","Type":"ContainerStarted","Data":"d0e1db6f1d95c1df8758a59b63fe8181647d1411e83ea60c2bfed415884c7351"} Dec 03 12:44:11 crc kubenswrapper[4666]: I1203 12:44:11.992121 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nsf9r" Dec 03 12:44:12 crc kubenswrapper[4666]: I1203 12:44:12.045400 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nsf9r" podStartSLOduration=31.978574581 podStartE2EDuration="37.045377669s" podCreationTimestamp="2025-12-03 12:43:35 +0000 UTC" firstStartedPulling="2025-12-03 12:44:05.53401162 +0000 UTC m=+1834.378972681" lastFinishedPulling="2025-12-03 12:44:10.600814718 +0000 UTC m=+1839.445775769" observedRunningTime="2025-12-03 12:44:12.039712186 +0000 
UTC m=+1840.884673247" watchObservedRunningTime="2025-12-03 12:44:12.045377669 +0000 UTC m=+1840.890338740" Dec 03 12:44:12 crc kubenswrapper[4666]: I1203 12:44:12.999226 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d17da10a-51b4-4ed7-8bbc-2b37be248419","Type":"ContainerStarted","Data":"d995074598b71846c845effc2446301aa8e1fced6954964d47f3650d424a61d3"} Dec 03 12:44:13 crc kubenswrapper[4666]: I1203 12:44:13.001350 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mdtd4" event={"ID":"2d005e60-fcd2-4546-a783-e4770dd9e1d5","Type":"ContainerStarted","Data":"112a49b158cc24d846e9c60a83ce6ed1640daa39b8e9c56fbe107e7ce2d50b33"} Dec 03 12:44:14 crc kubenswrapper[4666]: I1203 12:44:14.014825 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mdtd4" event={"ID":"2d005e60-fcd2-4546-a783-e4770dd9e1d5","Type":"ContainerStarted","Data":"d331f7a61fe201a32b0696146ff2388e29552e6ebf2c363b089959ea82c58832"} Dec 03 12:44:14 crc kubenswrapper[4666]: I1203 12:44:14.015342 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:44:14 crc kubenswrapper[4666]: I1203 12:44:14.015361 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:44:14 crc kubenswrapper[4666]: I1203 12:44:14.044249 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mdtd4" podStartSLOduration=34.939275409 podStartE2EDuration="39.044222804s" podCreationTimestamp="2025-12-03 12:43:35 +0000 UTC" firstStartedPulling="2025-12-03 12:44:06.484876726 +0000 UTC m=+1835.329837797" lastFinishedPulling="2025-12-03 12:44:10.589824101 +0000 UTC m=+1839.434785192" observedRunningTime="2025-12-03 12:44:14.043905675 +0000 UTC m=+1842.888866716" watchObservedRunningTime="2025-12-03 12:44:14.044222804 +0000 UTC m=+1842.889183865" Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.031781 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"43a43416-3214-46ba-8a00-6939bb265c8a","Type":"ContainerStarted","Data":"7bf9d32f48700f39be59795142b9ac716d55bea8b6882575ed86aab5f481593c"} Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.035658 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55083d6a-bded-48e2-a0ce-3befa24ce873","Type":"ContainerStarted","Data":"a2ff7b1e38e565725ef81059d77085643b4e85f3e6b4cb4938c6604521625fc6"} Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.038521 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bdd6a24-e604-459e-8eba-ea0d2638fdf5","Type":"ContainerStarted","Data":"b65f54e7839c6d916bcba4ccb0ce7f5bf90e181f83a3b4e1fd93ef13e5bfd977"} Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.093516 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.605164035 podStartE2EDuration="40.093471007s" podCreationTimestamp="2025-12-03 12:43:35 +0000 UTC" firstStartedPulling="2025-12-03 12:44:05.533663651 +0000 UTC m=+1834.378624712" lastFinishedPulling="2025-12-03 12:44:14.021970623 +0000 UTC m=+1842.866931684" observedRunningTime="2025-12-03 12:44:15.086861678 +0000 UTC m=+1843.931822729" watchObservedRunningTime="2025-12-03 12:44:15.093471007 +0000 UTC m=+1843.938432058" Dec 03 12:44:15 
crc kubenswrapper[4666]: I1203 12:44:15.107869 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=29.561131128 podStartE2EDuration="37.107850515s" podCreationTimestamp="2025-12-03 12:43:38 +0000 UTC" firstStartedPulling="2025-12-03 12:44:06.501491625 +0000 UTC m=+1835.346452686" lastFinishedPulling="2025-12-03 12:44:14.048211002 +0000 UTC m=+1842.893172073" observedRunningTime="2025-12-03 12:44:15.107477535 +0000 UTC m=+1843.952438606" watchObservedRunningTime="2025-12-03 12:44:15.107850515 +0000 UTC m=+1843.952811556" Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.397039 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.451354 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.541244 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 12:44:15 crc kubenswrapper[4666]: I1203 12:44:15.961162 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.013757 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.050895 4666 generic.go:334] "Generic (PLEG): container finished" podID="d17da10a-51b4-4ed7-8bbc-2b37be248419" containerID="d995074598b71846c845effc2446301aa8e1fced6954964d47f3650d424a61d3" exitCode=0 Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.051050 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d17da10a-51b4-4ed7-8bbc-2b37be248419","Type":"ContainerDied","Data":"d995074598b71846c845effc2446301aa8e1fced6954964d47f3650d424a61d3"} Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.051574 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.051710 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.097791 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.104428 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.490052 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.539916 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l9wjt"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.541546 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.544461 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.550174 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.551967 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.554601 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.576079 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580037 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580069 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-config\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580102 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46js\" (UniqueName: \"kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580159 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7d6m\" (UniqueName: \"kubernetes.io/projected/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-kube-api-access-t7d6m\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580210 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580267 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovs-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580303 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580319 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovn-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580594 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.580646 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-combined-ca-bundle\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.585158 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l9wjt"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.639272 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.676739 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.680727 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681833 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681872 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-config\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681891 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46js\" (UniqueName: \"kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681913 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7d6m\" (UniqueName: \"kubernetes.io/projected/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-kube-api-access-t7d6m\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681941 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681965 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovs-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.681986 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.682002 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovn-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.682058 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" 
Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.682104 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-combined-ca-bundle\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.686264 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.687692 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovs-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.687919 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-ovn-rundir\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.693011 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-config\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.695484 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.695721 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.695805 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.707134 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-combined-ca-bundle\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.710484 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.725702 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.734698 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7d6m\" (UniqueName: \"kubernetes.io/projected/844dc007-fbd5-4ca4-9f2f-dc3f2382a653-kube-api-access-t7d6m\") pod \"ovn-controller-metrics-l9wjt\" (UID: \"844dc007-fbd5-4ca4-9f2f-dc3f2382a653\") " pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.735625 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46js\" (UniqueName: \"kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js\") pod \"dnsmasq-dns-7f896c8c65-bhvxh\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.781253 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.783334 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pw9n\" (UniqueName: \"kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.783393 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.783430 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.783462 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.783520 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.789539 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.792710 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6lpzd" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.792875 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.793044 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.793214 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.809006 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.874455 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l9wjt" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.881635 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885149 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885249 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-scripts\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885281 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pw9n\" (UniqueName: \"kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885316 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885347 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885386 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc 
kubenswrapper[4666]: I1203 12:44:16.885441 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42m8k\" (UniqueName: \"kubernetes.io/projected/8e9fa52f-aa3b-4705-8a71-e48befc92571-kube-api-access-42m8k\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885474 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885499 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885534 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885579 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.885602 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-config\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.886480 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.886492 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.886744 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.887007 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.905165 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pw9n\" (UniqueName: \"kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n\") pod \"dnsmasq-dns-86db49b7ff-452xr\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986212 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986269 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986307 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42m8k\" (UniqueName: \"kubernetes.io/projected/8e9fa52f-aa3b-4705-8a71-e48befc92571-kube-api-access-42m8k\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986336 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986368 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986404 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-config\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986485 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-scripts\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.986745 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:16 crc kubenswrapper[4666]: I1203 12:44:16.987436 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-scripts\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.063620 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.068744 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d17da10a-51b4-4ed7-8bbc-2b37be248419","Type":"ContainerStarted","Data":"b758ae9942d713986d56e7b2b0371f9687314b74968bc0cc8393492dec7a7646"} Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.097539 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.102324 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.104648 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e9fa52f-aa3b-4705-8a71-e48befc92571-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.106359 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9fa52f-aa3b-4705-8a71-e48befc92571-config\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.114856 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42m8k\" (UniqueName: \"kubernetes.io/projected/8e9fa52f-aa3b-4705-8a71-e48befc92571-kube-api-access-42m8k\") pod \"ovn-northd-0\" (UID: \"8e9fa52f-aa3b-4705-8a71-e48befc92571\") " pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.244972 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.257350 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.274355 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.25069391 podStartE2EDuration="49.274337238s" podCreationTimestamp="2025-12-03 12:43:28 +0000 UTC" firstStartedPulling="2025-12-03 12:43:30.908941254 +0000 UTC m=+1799.753902305" lastFinishedPulling="2025-12-03 12:44:11.932584572 +0000 UTC m=+1840.777545633" observedRunningTime="2025-12-03 12:44:17.102830285 +0000 UTC m=+1845.947791366" watchObservedRunningTime="2025-12-03 12:44:17.274337238 +0000 UTC m=+1846.119298289" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.395879 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config\") pod \"45ecec21-800f-41f4-b62e-7755102e5f1a\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.395979 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config\") pod \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396063 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc\") pod \"45ecec21-800f-41f4-b62e-7755102e5f1a\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396106 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv62k\" (UniqueName: \"kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k\") pod \"45ecec21-800f-41f4-b62e-7755102e5f1a\" (UID: \"45ecec21-800f-41f4-b62e-7755102e5f1a\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396143 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc\") pod \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396205 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvnt\" (UniqueName: \"kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt\") pod \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\" (UID: \"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92\") " Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396561 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config" (OuterVolumeSpecName: "config") pod "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" (UID: "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396614 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45ecec21-800f-41f4-b62e-7755102e5f1a" (UID: "45ecec21-800f-41f4-b62e-7755102e5f1a"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.396578 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config" (OuterVolumeSpecName: "config") pod "45ecec21-800f-41f4-b62e-7755102e5f1a" (UID: "45ecec21-800f-41f4-b62e-7755102e5f1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.400483 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k" (OuterVolumeSpecName: "kube-api-access-xv62k") pod "45ecec21-800f-41f4-b62e-7755102e5f1a" (UID: "45ecec21-800f-41f4-b62e-7755102e5f1a"). InnerVolumeSpecName "kube-api-access-xv62k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.401972 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" (UID: "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.407112 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.433468 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt" (OuterVolumeSpecName: "kube-api-access-2tvnt") pod "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" (UID: "0cf1a8ea-35a4-485d-a22b-2a358bd0ce92"). InnerVolumeSpecName "kube-api-access-2tvnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497870 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvnt\" (UniqueName: \"kubernetes.io/projected/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-kube-api-access-2tvnt\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497901 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497911 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497919 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ecec21-800f-41f4-b62e-7755102e5f1a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497928 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv62k\" (UniqueName: \"kubernetes.io/projected/45ecec21-800f-41f4-b62e-7755102e5f1a-kube-api-access-xv62k\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.497939 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.583505 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l9wjt"] Dec 03 12:44:17 crc kubenswrapper[4666]: E1203 12:44:17.638821 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf1a8ea_35a4_485d_a22b_2a358bd0ce92.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf1a8ea_35a4_485d_a22b_2a358bd0ce92.slice/crio-d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ecec21_800f_41f4_b62e_7755102e5f1a.slice/crio-5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a\": RecentStats: unable to find data in memory cache]" Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.680429 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.697057 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 12:44:17 crc kubenswrapper[4666]: I1203 12:44:17.738616 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.077685 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l9wjt" event={"ID":"844dc007-fbd5-4ca4-9f2f-dc3f2382a653","Type":"ContainerStarted","Data":"316adefd8dc177740f3e7af16002e223eeb5dd4a88cc1878091e3cae45c45a89"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.077735 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-l9wjt" event={"ID":"844dc007-fbd5-4ca4-9f2f-dc3f2382a653","Type":"ContainerStarted","Data":"92e975f36c0d143fd2e1f72c53ea6a3ef0181e558a4c6b292f86d90940b185ad"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.080728 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" event={"ID":"0cf1a8ea-35a4-485d-a22b-2a358bd0ce92","Type":"ContainerDied","Data":"d664de1ed112adaf1b49f4ad7daf8b6df5eeb4d02f5dbe25f9549eb0c2c75fec"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.080806 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-htg9r" Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.083358 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e9fa52f-aa3b-4705-8a71-e48befc92571","Type":"ContainerStarted","Data":"1241ddfbe677015ba99d24d46be182eb1fb08deaf98fa104c2f04f4a25a43585"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.085591 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" event={"ID":"45ecec21-800f-41f4-b62e-7755102e5f1a","Type":"ContainerDied","Data":"5521889628fd3e1b35dd7de418b1377673a1690f1b58c6bebae161ca999e175a"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.085649 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qk8cx" Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.087899 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" event={"ID":"bd2bc7f2-09f4-46d1-8640-183260d1ccb8","Type":"ContainerStarted","Data":"4195758383b42be222c809edc2b05500cea3ad929772eb4062e4675a467d41a4"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.090011 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" event={"ID":"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7","Type":"ContainerStarted","Data":"e99301092cdf0ef6de4f720e69e35c88bcbcb99a5be952b8f0872fda022d294a"} Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.102807 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l9wjt" podStartSLOduration=2.102784837 podStartE2EDuration="2.102784837s" podCreationTimestamp="2025-12-03 12:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:18.099504368 +0000 UTC m=+1846.944465429" watchObservedRunningTime="2025-12-03 12:44:18.102784837 +0000 UTC m=+1846.947745888" Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.179890 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"] Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.188183 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qk8cx"] Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.221281 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"] Dec 03 12:44:18 crc kubenswrapper[4666]: I1203 12:44:18.228882 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-htg9r"] Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.099810 4666 generic.go:334] "Generic (PLEG): container finished" podID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" 
containerID="d709e0ec0cae2052756127a4a3343cc8dac3fcdf3469c2dd5ddbdd52d4e0c4e9" exitCode=0 Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.099921 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" event={"ID":"bd2bc7f2-09f4-46d1-8640-183260d1ccb8","Type":"ContainerDied","Data":"d709e0ec0cae2052756127a4a3343cc8dac3fcdf3469c2dd5ddbdd52d4e0c4e9"} Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.103480 4666 generic.go:334] "Generic (PLEG): container finished" podID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerID="a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605" exitCode=0 Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.103696 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" event={"ID":"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7","Type":"ContainerDied","Data":"a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605"} Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.106696 4666 generic.go:334] "Generic (PLEG): container finished" podID="43a43416-3214-46ba-8a00-6939bb265c8a" containerID="7bf9d32f48700f39be59795142b9ac716d55bea8b6882575ed86aab5f481593c" exitCode=0 Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.106754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"43a43416-3214-46ba-8a00-6939bb265c8a","Type":"ContainerDied","Data":"7bf9d32f48700f39be59795142b9ac716d55bea8b6882575ed86aab5f481593c"} Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.109890 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e9fa52f-aa3b-4705-8a71-e48befc92571","Type":"ContainerStarted","Data":"89f59280eece0289a954f9c4d7ff4f6ccd6ee3f6b42a0aa1adcb9aacea82e65c"} Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.423639 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:44:19 crc kubenswrapper[4666]: E1203 12:44:19.424018 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.433131 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf1a8ea-35a4-485d-a22b-2a358bd0ce92" path="/var/lib/kubelet/pods/0cf1a8ea-35a4-485d-a22b-2a358bd0ce92/volumes" Dec 03 12:44:19 crc kubenswrapper[4666]: I1203 12:44:19.433812 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ecec21-800f-41f4-b62e-7755102e5f1a" path="/var/lib/kubelet/pods/45ecec21-800f-41f4-b62e-7755102e5f1a/volumes" Dec 03 12:44:20 crc kubenswrapper[4666]: I1203 12:44:20.193433 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 12:44:20 crc kubenswrapper[4666]: I1203 12:44:20.193481 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.147704 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"43a43416-3214-46ba-8a00-6939bb265c8a","Type":"ContainerStarted","Data":"c1e6aae0a922fe611ecd4ae4a7d22cb7957ca009beaa6ed54e5a671979b75a01"} Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.150673 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e9fa52f-aa3b-4705-8a71-e48befc92571","Type":"ContainerStarted","Data":"ee7e5af0e0b19ddaf6a6c88442cc84733feac92c5de46468e0487ff297099e2d"} Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.150793 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.152267 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74","Type":"ContainerStarted","Data":"f6bda6c9e80596ed08358252b619f5ef36bb71435df79dd92adbe4e4abbb4789"} Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.152489 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.154333 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" event={"ID":"bd2bc7f2-09f4-46d1-8640-183260d1ccb8","Type":"ContainerStarted","Data":"c5c8364662d2aebcba8d5b3979f64fd1000e8c23732dbf7831ae8e92355f840a"} Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.155293 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.158607 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" event={"ID":"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7","Type":"ContainerStarted","Data":"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202"} Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.158741 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.166285 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371980.688505 podStartE2EDuration="56.166269994s" podCreationTimestamp="2025-12-03 12:43:27 +0000 UTC" firstStartedPulling="2025-12-03 12:43:29.220579096 +0000 UTC m=+1798.065540137" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:23.165437992 +0000 UTC m=+1852.010399043" watchObservedRunningTime="2025-12-03 12:44:23.166269994 +0000 UTC m=+1852.011231045" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.181901 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.184920 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" podStartSLOduration=6.693492323 podStartE2EDuration="7.184909478s" podCreationTimestamp="2025-12-03 12:44:16 +0000 UTC" firstStartedPulling="2025-12-03 12:44:17.749206095 +0000 UTC m=+1846.594167146" lastFinishedPulling="2025-12-03 12:44:18.24062325 +0000 UTC m=+1847.085584301" observedRunningTime="2025-12-03 12:44:23.18461214 +0000 UTC m=+1852.029573191" watchObservedRunningTime="2025-12-03 12:44:23.184909478 +0000 UTC m=+1852.029870529" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.201626 4666 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" podStartSLOduration=6.646611377 podStartE2EDuration="7.201610399s" podCreationTimestamp="2025-12-03 12:44:16 +0000 UTC" firstStartedPulling="2025-12-03 12:44:17.684529728 +0000 UTC m=+1846.529490779" lastFinishedPulling="2025-12-03 12:44:18.23952875 +0000 UTC m=+1847.084489801" observedRunningTime="2025-12-03 12:44:23.199291636 +0000 UTC m=+1852.044252717" watchObservedRunningTime="2025-12-03 12:44:23.201610399 +0000 UTC m=+1852.046571450" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.230936 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.249887308 podStartE2EDuration="52.23091638s" podCreationTimestamp="2025-12-03 12:43:31 +0000 UTC" firstStartedPulling="2025-12-03 12:43:32.963807551 +0000 UTC m=+1801.808768602" lastFinishedPulling="2025-12-03 12:44:22.944836613 +0000 UTC m=+1851.789797674" observedRunningTime="2025-12-03 12:44:23.219343548 +0000 UTC m=+1852.064304609" watchObservedRunningTime="2025-12-03 12:44:23.23091638 +0000 UTC m=+1852.075877421" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.243756 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=6.042544469 podStartE2EDuration="7.243736557s" podCreationTimestamp="2025-12-03 12:44:16 +0000 UTC" firstStartedPulling="2025-12-03 12:44:17.691047314 +0000 UTC m=+1846.536008365" lastFinishedPulling="2025-12-03 12:44:18.892239402 +0000 UTC m=+1847.737200453" observedRunningTime="2025-12-03 12:44:23.242991477 +0000 UTC m=+1852.087952538" watchObservedRunningTime="2025-12-03 12:44:23.243736557 +0000 UTC m=+1852.088697608" Dec 03 12:44:23 crc kubenswrapper[4666]: I1203 12:44:23.274515 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.065274 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.155007 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.155495 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="dnsmasq-dns" containerID="cri-o://27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202" gracePeriod=10 Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.156277 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.637217 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.680568 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config\") pod \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.680646 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc\") pod \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.680731 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h46js\" (UniqueName: \"kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js\") pod \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.680787 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb\") pod \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\" (UID: \"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7\") " Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.693003 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js" (OuterVolumeSpecName: "kube-api-access-h46js") pod "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" (UID: "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7"). InnerVolumeSpecName "kube-api-access-h46js". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.726587 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" (UID: "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.731086 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" (UID: "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.734965 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config" (OuterVolumeSpecName: "config") pod "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" (UID: "9feee9cf-ac44-4dc7-8fa0-24474c59c3d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.782063 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h46js\" (UniqueName: \"kubernetes.io/projected/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-kube-api-access-h46js\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.782091 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.782101 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:27 crc kubenswrapper[4666]: I1203 12:44:27.782122 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.230961 4666 generic.go:334] "Generic (PLEG): container finished" podID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerID="27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202" exitCode=0 Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.231054 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.231033 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" event={"ID":"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7","Type":"ContainerDied","Data":"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202"} Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.231458 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-bhvxh" event={"ID":"9feee9cf-ac44-4dc7-8fa0-24474c59c3d7","Type":"ContainerDied","Data":"e99301092cdf0ef6de4f720e69e35c88bcbcb99a5be952b8f0872fda022d294a"} Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.231490 4666 scope.go:117] "RemoveContainer" containerID="27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.256997 4666 scope.go:117] "RemoveContainer" containerID="a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.264933 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.273040 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-bhvxh"] Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.296796 4666 scope.go:117] "RemoveContainer" containerID="27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202" Dec 03 12:44:28 crc kubenswrapper[4666]: E1203 12:44:28.297293 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202\": container with ID starting with 27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202 not found: ID does not exist" containerID="27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202" Dec 03 12:44:28 crc kubenswrapper[4666]: 
I1203 12:44:28.297323 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202"} err="failed to get container status \"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202\": rpc error: code = NotFound desc = could not find container \"27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202\": container with ID starting with 27f21aad095c96c742160acd56961033c53942637511f4f051a9b18b5a3e1202 not found: ID does not exist" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.297344 4666 scope.go:117] "RemoveContainer" containerID="a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605" Dec 03 12:44:28 crc kubenswrapper[4666]: E1203 12:44:28.297685 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605\": container with ID starting with a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605 not found: ID does not exist" containerID="a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.297726 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605"} err="failed to get container status \"a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605\": rpc error: code = NotFound desc = could not find container \"a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605\": container with ID starting with a6d5f09377db00273cd3a2d5ecdd05e4cb5006c802fe8219866c91f91c312605 not found: ID does not exist" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.785280 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 12:44:28 crc kubenswrapper[4666]: I1203 12:44:28.785373 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 12:44:29 crc kubenswrapper[4666]: I1203 12:44:29.437147 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" path="/var/lib/kubelet/pods/9feee9cf-ac44-4dc7-8fa0-24474c59c3d7/volumes" Dec 03 12:44:31 crc kubenswrapper[4666]: I1203 12:44:31.047207 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 12:44:31 crc kubenswrapper[4666]: I1203 12:44:31.115274 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 12:44:32 crc kubenswrapper[4666]: I1203 12:44:32.218534 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 12:44:32 crc kubenswrapper[4666]: I1203 12:44:32.479022 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 12:44:34 crc kubenswrapper[4666]: I1203 12:44:34.423606 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:44:34 crc kubenswrapper[4666]: E1203 12:44:34.424494 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.795596 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f6df-account-create-update-rnxbk"] Dec 03 12:44:35 crc kubenswrapper[4666]: E1203 12:44:35.796016 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="dnsmasq-dns" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.796033 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="dnsmasq-dns" Dec 03 12:44:35 crc kubenswrapper[4666]: E1203 12:44:35.796051 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="init" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.796058 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="init" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.796291 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9feee9cf-ac44-4dc7-8fa0-24474c59c3d7" containerName="dnsmasq-dns" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.796914 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.804282 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.805491 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f6df-account-create-update-rnxbk"] Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.826143 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.826555 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ls6p\" (UniqueName: \"kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.837847 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-r79jt"] Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.838835 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r79jt" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.855990 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r79jt"] Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.927835 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.927912 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwfn\" (UniqueName: \"kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.928002 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.928038 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ls6p\" (UniqueName: \"kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.928756 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:35 crc kubenswrapper[4666]: I1203 12:44:35.946728 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ls6p\" (UniqueName: \"kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p\") pod \"glance-f6df-account-create-update-rnxbk\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.030435 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqwfn\" (UniqueName: \"kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.030691 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.033731 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.049480 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwfn\" (UniqueName: \"kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn\") pod \"glance-db-create-r79jt\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " pod="openstack/glance-db-create-r79jt" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.126959 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.156519 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r79jt" Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.599587 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f6df-account-create-update-rnxbk"] Dec 03 12:44:36 crc kubenswrapper[4666]: W1203 12:44:36.606365 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf41bd6e_97a2_4fe8_abab_ef09475a7c9b.slice/crio-ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd WatchSource:0}: Error finding container ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd: Status 404 returned error can't find the container with id ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd Dec 03 12:44:36 crc kubenswrapper[4666]: I1203 12:44:36.707777 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r79jt"] Dec 03 12:44:37 crc kubenswrapper[4666]: I1203 12:44:37.315758 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r79jt" event={"ID":"efb83348-41a7-482e-88e1-270211aacb35","Type":"ContainerStarted","Data":"07cb92bb9e11de2bfc851b993ac1843be066a205dde3c0f1cb8bbf595eac3acb"} Dec 03 12:44:37 crc kubenswrapper[4666]: I1203 12:44:37.317450 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f6df-account-create-update-rnxbk" event={"ID":"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b","Type":"ContainerStarted","Data":"ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd"} Dec 03 12:44:38 crc kubenswrapper[4666]: I1203 12:44:38.329051 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r79jt" event={"ID":"efb83348-41a7-482e-88e1-270211aacb35","Type":"ContainerStarted","Data":"0254f3f6217317ca9cd0c26a7ea9ebb701d8560dacccbd22066574bc43d4942b"} Dec 03 12:44:38 crc kubenswrapper[4666]: I1203 12:44:38.334212 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f6df-account-create-update-rnxbk" event={"ID":"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b","Type":"ContainerStarted","Data":"e502eef6f2b687e09154cac1cee14a1bd5bf862b71d4ffab7e8fc9b882ed81bd"} Dec 03 12:44:38 crc kubenswrapper[4666]: I1203 12:44:38.347662 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-r79jt" podStartSLOduration=3.347643569 podStartE2EDuration="3.347643569s" podCreationTimestamp="2025-12-03 12:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:38.344897095 +0000 UTC m=+1867.189858166" watchObservedRunningTime="2025-12-03 12:44:38.347643569 +0000 UTC m=+1867.192604630" Dec 03 12:44:38 crc kubenswrapper[4666]: I1203 12:44:38.371795 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f6df-account-create-update-rnxbk" podStartSLOduration=3.371772461 podStartE2EDuration="3.371772461s" podCreationTimestamp="2025-12-03 12:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:38.364754011 +0000 UTC m=+1867.209715082" watchObservedRunningTime="2025-12-03 12:44:38.371772461 +0000 UTC m=+1867.216733522" Dec 03 12:44:39 crc kubenswrapper[4666]: I1203 12:44:39.343968 4666 generic.go:334] "Generic (PLEG): container finished" podID="efb83348-41a7-482e-88e1-270211aacb35" containerID="0254f3f6217317ca9cd0c26a7ea9ebb701d8560dacccbd22066574bc43d4942b" exitCode=0 Dec 03 12:44:39 crc kubenswrapper[4666]: I1203 12:44:39.344319 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r79jt" event={"ID":"efb83348-41a7-482e-88e1-270211aacb35","Type":"ContainerDied","Data":"0254f3f6217317ca9cd0c26a7ea9ebb701d8560dacccbd22066574bc43d4942b"} Dec 03 12:44:39 crc kubenswrapper[4666]: I1203 12:44:39.346701 4666 generic.go:334] "Generic (PLEG): container finished" podID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" containerID="e502eef6f2b687e09154cac1cee14a1bd5bf862b71d4ffab7e8fc9b882ed81bd" exitCode=0 Dec 03 12:44:39 crc kubenswrapper[4666]: I1203 12:44:39.346769 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f6df-account-create-update-rnxbk" event={"ID":"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b","Type":"ContainerDied","Data":"e502eef6f2b687e09154cac1cee14a1bd5bf862b71d4ffab7e8fc9b882ed81bd"} Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.114181 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zzrnb"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.116124 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.120751 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zzrnb"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.209223 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.209345 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpfd\" (UniqueName: \"kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.213711 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0ae4-account-create-update-hd4gw"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.216801 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.219816 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.228724 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0ae4-account-create-update-hd4gw"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.309893 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.309995 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj844\" (UniqueName: \"kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.310026 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpfd\" (UniqueName: \"kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.310061 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.310860 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.327767 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpfd\" (UniqueName: \"kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd\") pod \"keystone-db-create-zzrnb\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.401341 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-f7mlv"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.402355 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.412995 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f7mlv"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.414796 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.414934 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj844\" (UniqueName: \"kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.414991 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.415040 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dtn\" (UniqueName: \"kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.417124 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.445297 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj844\" (UniqueName: \"kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844\") pod \"keystone-0ae4-account-create-update-hd4gw\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.490596 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.516351 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dtn\" (UniqueName: \"kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.516441 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.518605 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.519571 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d9a7-account-create-update-lqv8d"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.521539 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.524186 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.527052 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9a7-account-create-update-lqv8d"] Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.533794 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dtn\" (UniqueName: \"kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn\") pod \"placement-db-create-f7mlv\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.542159 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.617627 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sk2p\" (UniqueName: \"kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.617741 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.719315 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.719401 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sk2p\" (UniqueName: \"kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.720426 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.720574 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.737866 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sk2p\" (UniqueName: \"kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p\") pod \"placement-d9a7-account-create-update-lqv8d\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.784820 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r79jt" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.801862 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.820236 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqwfn\" (UniqueName: \"kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn\") pod \"efb83348-41a7-482e-88e1-270211aacb35\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.820346 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ls6p\" (UniqueName: \"kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p\") pod \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.820393 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts\") pod \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\" (UID: \"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b\") " Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.820551 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts\") pod \"efb83348-41a7-482e-88e1-270211aacb35\" (UID: \"efb83348-41a7-482e-88e1-270211aacb35\") " Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.822414 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb83348-41a7-482e-88e1-270211aacb35" (UID: "efb83348-41a7-482e-88e1-270211aacb35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.822982 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" (UID: "cf41bd6e-97a2-4fe8-abab-ef09475a7c9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.824288 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p" (OuterVolumeSpecName: "kube-api-access-7ls6p") pod "cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" (UID: "cf41bd6e-97a2-4fe8-abab-ef09475a7c9b"). InnerVolumeSpecName "kube-api-access-7ls6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.827876 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn" (OuterVolumeSpecName: "kube-api-access-qqwfn") pod "efb83348-41a7-482e-88e1-270211aacb35" (UID: "efb83348-41a7-482e-88e1-270211aacb35"). InnerVolumeSpecName "kube-api-access-qqwfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.921836 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb83348-41a7-482e-88e1-270211aacb35-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.921867 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqwfn\" (UniqueName: \"kubernetes.io/projected/efb83348-41a7-482e-88e1-270211aacb35-kube-api-access-qqwfn\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.921878 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ls6p\" (UniqueName: \"kubernetes.io/projected/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-kube-api-access-7ls6p\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.921887 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.921994 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:40 crc kubenswrapper[4666]: I1203 12:44:40.961211 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zzrnb"] Dec 03 12:44:40 crc kubenswrapper[4666]: W1203 12:44:40.972053 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42919b07_c6bf_47a5_8bc9_973574afd913.slice/crio-a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02 WatchSource:0}: Error finding container a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02: Status 404 returned error can't find the container with id a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02 Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.026891 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nsf9r" podUID="e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342" containerName="ovn-controller" probeResult="failure" output=< Dec 03 12:44:41 crc kubenswrapper[4666]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 12:44:41 crc kubenswrapper[4666]: > Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.053307 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0ae4-account-create-update-hd4gw"] Dec 03 12:44:41 crc kubenswrapper[4666]: W1203 12:44:41.060149 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7cb48a9_dd01_4708_89e7_cc5702003f99.slice/crio-6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176 WatchSource:0}: Error finding container 6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176: Status 404 returned error can't find the container with id 6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176 Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.143860 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f7mlv"] Dec 03 12:44:41 crc kubenswrapper[4666]: W1203 12:44:41.148267 4666 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc33a6a53_cf96_4f2e_bc74_5aad83b6298e.slice/crio-bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb WatchSource:0}: Error finding container bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb: Status 404 returned error can't find the container with id bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.351048 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9a7-account-create-update-lqv8d"] Dec 03 12:44:41 crc kubenswrapper[4666]: W1203 12:44:41.353252 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ec08bfb_edbf_4793_9d37_96f6e0e766d8.slice/crio-46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09 WatchSource:0}: Error finding container 46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09: Status 404 returned error can't find the container with id 46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09 Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.365670 4666 generic.go:334] "Generic (PLEG): container finished" podID="49ae1478-c8e5-4175-bf32-f96a34996999" containerID="fca316e7ded99ec6b82b57df1075d0e8bf2f795f283750a00ec6b65878f7525c" exitCode=0 Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.365762 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerDied","Data":"fca316e7ded99ec6b82b57df1075d0e8bf2f795f283750a00ec6b65878f7525c"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.367054 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7mlv" event={"ID":"c33a6a53-cf96-4f2e-bc74-5aad83b6298e","Type":"ContainerStarted","Data":"bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.378123 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzrnb" event={"ID":"42919b07-c6bf-47a5-8bc9-973574afd913","Type":"ContainerStarted","Data":"a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.382471 4666 generic.go:334] "Generic (PLEG): container finished" podID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerID="3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279" exitCode=0 Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.382888 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerDied","Data":"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.390660 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r79jt" event={"ID":"efb83348-41a7-482e-88e1-270211aacb35","Type":"ContainerDied","Data":"07cb92bb9e11de2bfc851b993ac1843be066a205dde3c0f1cb8bbf595eac3acb"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.390707 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07cb92bb9e11de2bfc851b993ac1843be066a205dde3c0f1cb8bbf595eac3acb" Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.390779 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r79jt" Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.398769 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0ae4-account-create-update-hd4gw" event={"ID":"d7cb48a9-dd01-4708-89e7-cc5702003f99","Type":"ContainerStarted","Data":"6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.402736 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f6df-account-create-update-rnxbk" event={"ID":"cf41bd6e-97a2-4fe8-abab-ef09475a7c9b","Type":"ContainerDied","Data":"ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd"} Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.402771 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:44:41 crc kubenswrapper[4666]: I1203 12:44:41.402786 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6049d1bb397a9ea3da240c67c83afff8a7e333d01d40e1d02dcf8c7677eefd" Dec 03 12:44:42 crc kubenswrapper[4666]: I1203 12:44:42.495229 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9a7-account-create-update-lqv8d" event={"ID":"8ec08bfb-edbf-4793-9d37-96f6e0e766d8","Type":"ContainerStarted","Data":"46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.506007 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerStarted","Data":"68cd18f4ae6aaff7863fe5b233fc1030b3b24abe7733723b0a0c020849b4e998"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.507242 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.509806 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7mlv" event={"ID":"c33a6a53-cf96-4f2e-bc74-5aad83b6298e","Type":"ContainerStarted","Data":"4ce2428ee107d8d3fa7a280a073c2e51cf79a9381a63d235c93dd738a1da2f18"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.512961 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzrnb" event={"ID":"42919b07-c6bf-47a5-8bc9-973574afd913","Type":"ContainerStarted","Data":"472bbcae1548c2ba82203d661e520d6432edac8df601cb98e5b01358ba34a357"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.515788 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9a7-account-create-update-lqv8d" event={"ID":"8ec08bfb-edbf-4793-9d37-96f6e0e766d8","Type":"ContainerStarted","Data":"9487932252ea3eea33b5edcb63b0f403cfd6d3109575eb0752b4e7824ead0f04"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.519311 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerStarted","Data":"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.519853 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.521783 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0ae4-account-create-update-hd4gw" 
event={"ID":"d7cb48a9-dd01-4708-89e7-cc5702003f99","Type":"ContainerStarted","Data":"130a10b959f51f0912a9673228e9dc34a296ed444a01a11fa8a2040679804694"} Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.540384 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.476416174 podStartE2EDuration="1m17.540367698s" podCreationTimestamp="2025-12-03 12:43:26 +0000 UTC" firstStartedPulling="2025-12-03 12:43:27.897162728 +0000 UTC m=+1796.742123779" lastFinishedPulling="2025-12-03 12:44:03.961114252 +0000 UTC m=+1832.806075303" observedRunningTime="2025-12-03 12:44:43.536486623 +0000 UTC m=+1872.381447694" watchObservedRunningTime="2025-12-03 12:44:43.540367698 +0000 UTC m=+1872.385328749" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.557659 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0ae4-account-create-update-hd4gw" podStartSLOduration=3.557639425 podStartE2EDuration="3.557639425s" podCreationTimestamp="2025-12-03 12:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:43.553518043 +0000 UTC m=+1872.398479084" watchObservedRunningTime="2025-12-03 12:44:43.557639425 +0000 UTC m=+1872.402600486" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.584433 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-f7mlv" podStartSLOduration=3.5844147079999997 podStartE2EDuration="3.584414708s" podCreationTimestamp="2025-12-03 12:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:43.570213104 +0000 UTC m=+1872.415174165" watchObservedRunningTime="2025-12-03 12:44:43.584414708 +0000 UTC m=+1872.429375759" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.620431 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371958.234364 podStartE2EDuration="1m18.62041281s" podCreationTimestamp="2025-12-03 12:43:25 +0000 UTC" firstStartedPulling="2025-12-03 12:43:27.965289018 +0000 UTC m=+1796.810250069" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:43.60152245 +0000 UTC m=+1872.446483501" watchObservedRunningTime="2025-12-03 12:44:43.62041281 +0000 UTC m=+1872.465373881" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.623040 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d9a7-account-create-update-lqv8d" podStartSLOduration=3.623030371 podStartE2EDuration="3.623030371s" podCreationTimestamp="2025-12-03 12:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:43.619710442 +0000 UTC m=+1872.464671483" watchObservedRunningTime="2025-12-03 12:44:43.623030371 +0000 UTC m=+1872.467991412" Dec 03 12:44:43 crc kubenswrapper[4666]: I1203 12:44:43.640241 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-zzrnb" podStartSLOduration=3.640224716 podStartE2EDuration="3.640224716s" podCreationTimestamp="2025-12-03 12:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
12:44:43.635463647 +0000 UTC m=+1872.480424698" watchObservedRunningTime="2025-12-03 12:44:43.640224716 +0000 UTC m=+1872.485185767" Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.531360 4666 generic.go:334] "Generic (PLEG): container finished" podID="8ec08bfb-edbf-4793-9d37-96f6e0e766d8" containerID="9487932252ea3eea33b5edcb63b0f403cfd6d3109575eb0752b4e7824ead0f04" exitCode=0 Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.531422 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9a7-account-create-update-lqv8d" event={"ID":"8ec08bfb-edbf-4793-9d37-96f6e0e766d8","Type":"ContainerDied","Data":"9487932252ea3eea33b5edcb63b0f403cfd6d3109575eb0752b4e7824ead0f04"} Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.533252 4666 generic.go:334] "Generic (PLEG): container finished" podID="d7cb48a9-dd01-4708-89e7-cc5702003f99" containerID="130a10b959f51f0912a9673228e9dc34a296ed444a01a11fa8a2040679804694" exitCode=0 Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.533303 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0ae4-account-create-update-hd4gw" event={"ID":"d7cb48a9-dd01-4708-89e7-cc5702003f99","Type":"ContainerDied","Data":"130a10b959f51f0912a9673228e9dc34a296ed444a01a11fa8a2040679804694"} Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.535014 4666 generic.go:334] "Generic (PLEG): container finished" podID="c33a6a53-cf96-4f2e-bc74-5aad83b6298e" containerID="4ce2428ee107d8d3fa7a280a073c2e51cf79a9381a63d235c93dd738a1da2f18" exitCode=0 Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.535057 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7mlv" event={"ID":"c33a6a53-cf96-4f2e-bc74-5aad83b6298e","Type":"ContainerDied","Data":"4ce2428ee107d8d3fa7a280a073c2e51cf79a9381a63d235c93dd738a1da2f18"} Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.539703 4666 generic.go:334] "Generic (PLEG): container finished" podID="42919b07-c6bf-47a5-8bc9-973574afd913" containerID="472bbcae1548c2ba82203d661e520d6432edac8df601cb98e5b01358ba34a357" exitCode=0 Dec 03 12:44:44 crc kubenswrapper[4666]: I1203 12:44:44.540259 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzrnb" event={"ID":"42919b07-c6bf-47a5-8bc9-973574afd913","Type":"ContainerDied","Data":"472bbcae1548c2ba82203d661e520d6432edac8df601cb98e5b01358ba34a357"} Dec 03 12:44:45 crc kubenswrapper[4666]: I1203 12:44:45.978477 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.077499 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nsf9r" podUID="e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342" containerName="ovn-controller" probeResult="failure" output=< Dec 03 12:44:46 crc kubenswrapper[4666]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 12:44:46 crc kubenswrapper[4666]: > Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.084735 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7ljg8"] Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.085415 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085641 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.085692 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33a6a53-cf96-4f2e-bc74-5aad83b6298e" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085722 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33a6a53-cf96-4f2e-bc74-5aad83b6298e" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.085737 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb83348-41a7-482e-88e1-270211aacb35" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085743 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb83348-41a7-482e-88e1-270211aacb35" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085934 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33a6a53-cf96-4f2e-bc74-5aad83b6298e" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085955 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb83348-41a7-482e-88e1-270211aacb35" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.085964 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.087216 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.091794 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7ljg8"] Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.094978 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.095335 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.096367 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mdtd4" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.096523 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b6lz5" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.135585 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts\") pod \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.135657 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5dtn\" (UniqueName: \"kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn\") pod \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\" (UID: \"c33a6a53-cf96-4f2e-bc74-5aad83b6298e\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.137831 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c33a6a53-cf96-4f2e-bc74-5aad83b6298e" (UID: "c33a6a53-cf96-4f2e-bc74-5aad83b6298e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.143380 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn" (OuterVolumeSpecName: "kube-api-access-s5dtn") pod "c33a6a53-cf96-4f2e-bc74-5aad83b6298e" (UID: "c33a6a53-cf96-4f2e-bc74-5aad83b6298e"). InnerVolumeSpecName "kube-api-access-s5dtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.191695 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.218290 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.227206 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237331 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89h9h\" (UniqueName: \"kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237378 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237415 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237532 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237588 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.237600 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5dtn\" (UniqueName: \"kubernetes.io/projected/c33a6a53-cf96-4f2e-bc74-5aad83b6298e-kube-api-access-s5dtn\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.336434 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nsf9r-config-g775k"] Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.337112 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42919b07-c6bf-47a5-8bc9-973574afd913" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.337207 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="42919b07-c6bf-47a5-8bc9-973574afd913" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.337328 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec08bfb-edbf-4793-9d37-96f6e0e766d8" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.337411 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec08bfb-edbf-4793-9d37-96f6e0e766d8" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: E1203 12:44:46.337524 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7cb48a9-dd01-4708-89e7-cc5702003f99" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.337599 4666 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d7cb48a9-dd01-4708-89e7-cc5702003f99" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.337876 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7cb48a9-dd01-4708-89e7-cc5702003f99" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.337975 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="42919b07-c6bf-47a5-8bc9-973574afd913" containerName="mariadb-database-create" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.338057 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec08bfb-edbf-4793-9d37-96f6e0e766d8" containerName="mariadb-account-create-update" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.338688 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts\") pod \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.338841 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts\") pod \"d7cb48a9-dd01-4708-89e7-cc5702003f99\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.338874 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts\") pod \"42919b07-c6bf-47a5-8bc9-973574afd913\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.338905 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sk2p\" (UniqueName: \"kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p\") pod \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\" (UID: \"8ec08bfb-edbf-4793-9d37-96f6e0e766d8\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339000 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mpfd\" (UniqueName: \"kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd\") pod \"42919b07-c6bf-47a5-8bc9-973574afd913\" (UID: \"42919b07-c6bf-47a5-8bc9-973574afd913\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339025 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj844\" (UniqueName: \"kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844\") pod \"d7cb48a9-dd01-4708-89e7-cc5702003f99\" (UID: \"d7cb48a9-dd01-4708-89e7-cc5702003f99\") " Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339328 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339390 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89h9h\" (UniqueName: 
\"kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339421 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339425 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ec08bfb-edbf-4793-9d37-96f6e0e766d8" (UID: "8ec08bfb-edbf-4793-9d37-96f6e0e766d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339455 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339507 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42919b07-c6bf-47a5-8bc9-973574afd913" (UID: "42919b07-c6bf-47a5-8bc9-973574afd913"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339722 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7cb48a9-dd01-4708-89e7-cc5702003f99" (UID: "d7cb48a9-dd01-4708-89e7-cc5702003f99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339836 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.339877 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42919b07-c6bf-47a5-8bc9-973574afd913-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.342510 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd" (OuterVolumeSpecName: "kube-api-access-2mpfd") pod "42919b07-c6bf-47a5-8bc9-973574afd913" (UID: "42919b07-c6bf-47a5-8bc9-973574afd913"). InnerVolumeSpecName "kube-api-access-2mpfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.343205 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844" (OuterVolumeSpecName: "kube-api-access-jj844") pod "d7cb48a9-dd01-4708-89e7-cc5702003f99" (UID: "d7cb48a9-dd01-4708-89e7-cc5702003f99"). InnerVolumeSpecName "kube-api-access-jj844". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.347252 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.349835 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.350933 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p" (OuterVolumeSpecName: "kube-api-access-2sk2p") pod "8ec08bfb-edbf-4793-9d37-96f6e0e766d8" (UID: "8ec08bfb-edbf-4793-9d37-96f6e0e766d8"). InnerVolumeSpecName "kube-api-access-2sk2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.354377 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.355959 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nsf9r-config-g775k"] Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.356665 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.362730 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.373896 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89h9h\" (UniqueName: \"kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h\") pod \"glance-db-sync-7ljg8\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441251 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441325 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltc9b\" (UniqueName: \"kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441372 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441397 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441620 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441659 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441719 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7cb48a9-dd01-4708-89e7-cc5702003f99-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441735 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sk2p\" (UniqueName: \"kubernetes.io/projected/8ec08bfb-edbf-4793-9d37-96f6e0e766d8-kube-api-access-2sk2p\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441751 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mpfd\" (UniqueName: \"kubernetes.io/projected/42919b07-c6bf-47a5-8bc9-973574afd913-kube-api-access-2mpfd\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.441765 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj844\" (UniqueName: \"kubernetes.io/projected/d7cb48a9-dd01-4708-89e7-cc5702003f99-kube-api-access-jj844\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.515552 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7ljg8" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.543604 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.543826 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.543921 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.543994 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.544121 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.544153 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.544196 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.544267 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.544595 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltc9b\" (UniqueName: \"kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " 
pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.545067 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.547707 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.561721 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zzrnb" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.561724 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzrnb" event={"ID":"42919b07-c6bf-47a5-8bc9-973574afd913","Type":"ContainerDied","Data":"a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02"} Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.561906 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a515881e5f35d1bf675950a839510739db97f208458f5a7a85745ca3a6206b02" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.562602 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltc9b\" (UniqueName: \"kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b\") pod \"ovn-controller-nsf9r-config-g775k\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.564161 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9a7-account-create-update-lqv8d" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.564228 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9a7-account-create-update-lqv8d" event={"ID":"8ec08bfb-edbf-4793-9d37-96f6e0e766d8","Type":"ContainerDied","Data":"46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09"} Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.565405 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f552b55c6184568f36477877f6e3d0cd0be633f543990af8b5c6ec1ca2dd09" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.568490 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0ae4-account-create-update-hd4gw" event={"ID":"d7cb48a9-dd01-4708-89e7-cc5702003f99","Type":"ContainerDied","Data":"6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176"} Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.568532 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0ae4-account-create-update-hd4gw" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.568537 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa80a137af296b7a5b059d72cab8c078ae2ec3d96460a640ee36595ce039176" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.580087 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f7mlv" event={"ID":"c33a6a53-cf96-4f2e-bc74-5aad83b6298e","Type":"ContainerDied","Data":"bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb"} Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.580155 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad793dbde71c3cbe8a8498261aa1c7d833fa6796aeb3d0abd41af68a7824ddb" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.580922 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f7mlv" Dec 03 12:44:46 crc kubenswrapper[4666]: I1203 12:44:46.666243 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:47 crc kubenswrapper[4666]: I1203 12:44:47.054664 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7ljg8"] Dec 03 12:44:47 crc kubenswrapper[4666]: W1203 12:44:47.055863 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22dc9c8_5134_46ec_ba0f_c2b334490035.slice/crio-7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302 WatchSource:0}: Error finding container 7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302: Status 404 returned error can't find the container with id 7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302 Dec 03 12:44:47 crc kubenswrapper[4666]: I1203 12:44:47.134330 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nsf9r-config-g775k"] Dec 03 12:44:47 crc kubenswrapper[4666]: W1203 12:44:47.140821 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e2df5c_33fb_4521_bc3e_563918842b4a.slice/crio-fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b WatchSource:0}: Error finding container fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b: Status 404 returned error can't find the container with id fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b Dec 03 12:44:47 crc kubenswrapper[4666]: I1203 12:44:47.424314 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:44:47 crc kubenswrapper[4666]: E1203 12:44:47.424864 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:44:47 crc kubenswrapper[4666]: I1203 12:44:47.592326 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r-config-g775k" 
event={"ID":"f9e2df5c-33fb-4521-bc3e-563918842b4a","Type":"ContainerStarted","Data":"fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b"} Dec 03 12:44:47 crc kubenswrapper[4666]: I1203 12:44:47.594784 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7ljg8" event={"ID":"d22dc9c8-5134-46ec-ba0f-c2b334490035","Type":"ContainerStarted","Data":"7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302"} Dec 03 12:44:49 crc kubenswrapper[4666]: I1203 12:44:49.612523 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r-config-g775k" event={"ID":"f9e2df5c-33fb-4521-bc3e-563918842b4a","Type":"ContainerStarted","Data":"a56ba1b8f1b37faccc51fcb13b9b2cf2532539e3bcb53b2c84c3dd7976bb74dd"} Dec 03 12:44:49 crc kubenswrapper[4666]: I1203 12:44:49.634179 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nsf9r-config-g775k" podStartSLOduration=3.6340727360000002 podStartE2EDuration="3.634072736s" podCreationTimestamp="2025-12-03 12:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:44:49.630036337 +0000 UTC m=+1878.474997408" watchObservedRunningTime="2025-12-03 12:44:49.634072736 +0000 UTC m=+1878.479033797" Dec 03 12:44:50 crc kubenswrapper[4666]: I1203 12:44:50.621605 4666 generic.go:334] "Generic (PLEG): container finished" podID="f9e2df5c-33fb-4521-bc3e-563918842b4a" containerID="a56ba1b8f1b37faccc51fcb13b9b2cf2532539e3bcb53b2c84c3dd7976bb74dd" exitCode=0 Dec 03 12:44:50 crc kubenswrapper[4666]: I1203 12:44:50.621669 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r-config-g775k" event={"ID":"f9e2df5c-33fb-4521-bc3e-563918842b4a","Type":"ContainerDied","Data":"a56ba1b8f1b37faccc51fcb13b9b2cf2532539e3bcb53b2c84c3dd7976bb74dd"} Dec 03 12:44:51 crc kubenswrapper[4666]: I1203 12:44:51.044165 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nsf9r" Dec 03 12:44:51 crc kubenswrapper[4666]: I1203 12:44:51.989320 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.131871 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.131931 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.131984 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132043 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132053 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132118 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132142 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run" (OuterVolumeSpecName: "var-run") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132134 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132246 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltc9b\" (UniqueName: \"kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b\") pod \"f9e2df5c-33fb-4521-bc3e-563918842b4a\" (UID: \"f9e2df5c-33fb-4521-bc3e-563918842b4a\") " Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132955 4666 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132974 4666 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.132984 4666 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9e2df5c-33fb-4521-bc3e-563918842b4a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.133481 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.134399 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts" (OuterVolumeSpecName: "scripts") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.137894 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b" (OuterVolumeSpecName: "kube-api-access-ltc9b") pod "f9e2df5c-33fb-4521-bc3e-563918842b4a" (UID: "f9e2df5c-33fb-4521-bc3e-563918842b4a"). InnerVolumeSpecName "kube-api-access-ltc9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.234527 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.234600 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltc9b\" (UniqueName: \"kubernetes.io/projected/f9e2df5c-33fb-4521-bc3e-563918842b4a-kube-api-access-ltc9b\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.234611 4666 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e2df5c-33fb-4521-bc3e-563918842b4a-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.636726 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nsf9r-config-g775k" event={"ID":"f9e2df5c-33fb-4521-bc3e-563918842b4a","Type":"ContainerDied","Data":"fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b"} Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.636772 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda12fbde7d56b7050f138fdf2bfa4c1714a16c94b6203bb5826aeb5ba398d1b" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.636842 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nsf9r-config-g775k" Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.733723 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nsf9r-config-g775k"] Dec 03 12:44:52 crc kubenswrapper[4666]: I1203 12:44:52.739567 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nsf9r-config-g775k"] Dec 03 12:44:53 crc kubenswrapper[4666]: I1203 12:44:53.436179 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e2df5c-33fb-4521-bc3e-563918842b4a" path="/var/lib/kubelet/pods/f9e2df5c-33fb-4521-bc3e-563918842b4a/volumes" Dec 03 12:44:57 crc kubenswrapper[4666]: I1203 12:44:57.489243 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Dec 03 12:44:57 crc kubenswrapper[4666]: I1203 12:44:57.615818 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Dec 03 12:44:59 crc kubenswrapper[4666]: I1203 12:44:59.424374 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:44:59 crc kubenswrapper[4666]: E1203 12:44:59.425052 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.136579 4666 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs"] Dec 03 12:45:00 crc kubenswrapper[4666]: E1203 12:45:00.141428 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e2df5c-33fb-4521-bc3e-563918842b4a" containerName="ovn-config" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.141455 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e2df5c-33fb-4521-bc3e-563918842b4a" containerName="ovn-config" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.141752 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e2df5c-33fb-4521-bc3e-563918842b4a" containerName="ovn-config" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.142365 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.142757 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs"] Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.144730 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.146216 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.278382 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.278518 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfcvd\" (UniqueName: \"kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.278567 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.379930 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.380049 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfcvd\" (UniqueName: \"kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd\") pod 
\"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.380075 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.380922 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.385349 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.395295 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfcvd\" (UniqueName: \"kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd\") pod \"collect-profiles-29412765-5thhs\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:00 crc kubenswrapper[4666]: I1203 12:45:00.469974 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:03 crc kubenswrapper[4666]: I1203 12:45:03.639049 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs"] Dec 03 12:45:03 crc kubenswrapper[4666]: W1203 12:45:03.642394 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f1cc55_1f71_490b_8def_9902e96f803a.slice/crio-ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44 WatchSource:0}: Error finding container ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44: Status 404 returned error can't find the container with id ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44 Dec 03 12:45:03 crc kubenswrapper[4666]: I1203 12:45:03.749043 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" event={"ID":"a9f1cc55-1f71-490b-8def-9902e96f803a","Type":"ContainerStarted","Data":"ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44"} Dec 03 12:45:04 crc kubenswrapper[4666]: I1203 12:45:04.760543 4666 generic.go:334] "Generic (PLEG): container finished" podID="a9f1cc55-1f71-490b-8def-9902e96f803a" containerID="c07add67fbe553b507eb70160f5e6481f96a10cb8cc0f8b64b7fbeb19570b012" exitCode=0 Dec 03 12:45:04 crc kubenswrapper[4666]: I1203 12:45:04.760587 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" event={"ID":"a9f1cc55-1f71-490b-8def-9902e96f803a","Type":"ContainerDied","Data":"c07add67fbe553b507eb70160f5e6481f96a10cb8cc0f8b64b7fbeb19570b012"} Dec 03 12:45:04 crc kubenswrapper[4666]: I1203 12:45:04.762539 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7ljg8" event={"ID":"d22dc9c8-5134-46ec-ba0f-c2b334490035","Type":"ContainerStarted","Data":"0b47f7cee016ee1af5123f69f109c02adc62dccb78c897fbd568bcf9de53c150"} Dec 03 12:45:04 crc kubenswrapper[4666]: I1203 12:45:04.793069 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7ljg8" podStartSLOduration=2.617711662 podStartE2EDuration="18.79304798s" podCreationTimestamp="2025-12-03 12:44:46 +0000 UTC" firstStartedPulling="2025-12-03 12:44:47.058333978 +0000 UTC m=+1875.903295029" lastFinishedPulling="2025-12-03 12:45:03.233670286 +0000 UTC m=+1892.078631347" observedRunningTime="2025-12-03 12:45:04.785891366 +0000 UTC m=+1893.630852427" watchObservedRunningTime="2025-12-03 12:45:04.79304798 +0000 UTC m=+1893.638009031" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.101060 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.277870 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume\") pod \"a9f1cc55-1f71-490b-8def-9902e96f803a\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.278080 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfcvd\" (UniqueName: \"kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd\") pod \"a9f1cc55-1f71-490b-8def-9902e96f803a\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.278137 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume\") pod \"a9f1cc55-1f71-490b-8def-9902e96f803a\" (UID: \"a9f1cc55-1f71-490b-8def-9902e96f803a\") " Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.279165 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume" (OuterVolumeSpecName: "config-volume") pod "a9f1cc55-1f71-490b-8def-9902e96f803a" (UID: "a9f1cc55-1f71-490b-8def-9902e96f803a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.283125 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd" (OuterVolumeSpecName: "kube-api-access-zfcvd") pod "a9f1cc55-1f71-490b-8def-9902e96f803a" (UID: "a9f1cc55-1f71-490b-8def-9902e96f803a"). InnerVolumeSpecName "kube-api-access-zfcvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.283439 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a9f1cc55-1f71-490b-8def-9902e96f803a" (UID: "a9f1cc55-1f71-490b-8def-9902e96f803a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.380532 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfcvd\" (UniqueName: \"kubernetes.io/projected/a9f1cc55-1f71-490b-8def-9902e96f803a-kube-api-access-zfcvd\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.380609 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9f1cc55-1f71-490b-8def-9902e96f803a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.380636 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9f1cc55-1f71-490b-8def-9902e96f803a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.778042 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" event={"ID":"a9f1cc55-1f71-490b-8def-9902e96f803a","Type":"ContainerDied","Data":"ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44"} Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.778319 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba266aec2e6c4ff4714bed6705f592dbb3353ee3d1f3f82664a2b437404a2f44" Dec 03 12:45:06 crc kubenswrapper[4666]: I1203 12:45:06.778151 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.489318 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.616389 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.841170 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vzd52"] Dec 03 12:45:07 crc kubenswrapper[4666]: E1203 12:45:07.841525 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f1cc55-1f71-490b-8def-9902e96f803a" containerName="collect-profiles" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.841542 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f1cc55-1f71-490b-8def-9902e96f803a" containerName="collect-profiles" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.841693 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f1cc55-1f71-490b-8def-9902e96f803a" containerName="collect-profiles" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.843719 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.851358 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vzd52"] Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.942051 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jjp74"] Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.943213 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.950302 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1a6f-account-create-update-gr78s"] Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.951640 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.958490 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.959163 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jjp74"] Dec 03 12:45:07 crc kubenswrapper[4666]: I1203 12:45:07.965700 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a6f-account-create-update-gr78s"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.008429 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k622j\" (UniqueName: \"kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.008528 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.045558 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0d69-account-create-update-bhx5p"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.057278 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.063511 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.070900 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0d69-account-create-update-bhx5p"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.110830 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.110924 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrpkn\" (UniqueName: \"kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.110989 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.111078 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4x8\" (UniqueName: \"kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.111212 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k622j\" (UniqueName: \"kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.111262 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.119930 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.123821 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-72mb9"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.124915 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.129492 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.129718 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.129914 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.130037 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l82fb" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.132838 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-72mb9"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.139738 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k622j\" (UniqueName: \"kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j\") pod \"cinder-db-create-vzd52\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.170156 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218174 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218249 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218307 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqlr4\" (UniqueName: \"kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4\") pod \"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218331 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz94\" (UniqueName: \"kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218366 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218399 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpkn\" (UniqueName: \"kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218440 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218471 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4x8\" (UniqueName: \"kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.218970 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts\") pod \"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.219585 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.219593 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.237688 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-22jn4"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.238945 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.241864 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrpkn\" (UniqueName: \"kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn\") pod \"cinder-1a6f-account-create-update-gr78s\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.247805 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-22jn4"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.293642 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.293896 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4x8\" (UniqueName: \"kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8\") pod \"barbican-db-create-jjp74\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320373 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz94\" (UniqueName: \"kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320428 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5gtf\" (UniqueName: \"kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320487 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320526 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts\") pod \"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320546 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320594 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.320665 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqlr4\" (UniqueName: \"kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4\") pod \"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.323320 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts\") pod 
\"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.325729 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.326552 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.341251 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqlr4\" (UniqueName: \"kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4\") pod \"barbican-0d69-account-create-update-bhx5p\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.341865 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz94\" (UniqueName: \"kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94\") pod \"keystone-db-sync-72mb9\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.344181 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-88bf-account-create-update-t9qsf"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.345254 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.348573 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.366841 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-88bf-account-create-update-t9qsf"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.374594 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.422016 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.422178 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5gtf\" (UniqueName: \"kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.422730 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.442615 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5gtf\" (UniqueName: \"kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf\") pod \"neutron-db-create-22jn4\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.478648 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.523705 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.523779 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6l5\" (UniqueName: \"kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.558717 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.612501 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.624663 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.624753 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6l5\" (UniqueName: \"kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.625415 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.642862 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6l5\" (UniqueName: \"kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5\") pod \"neutron-88bf-account-create-update-t9qsf\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.679018 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.732365 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vzd52"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.790396 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a6f-account-create-update-gr78s"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.800449 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vzd52" event={"ID":"d678da1e-e438-4e63-99e4-8eb737a077f5","Type":"ContainerStarted","Data":"553227ea0561d1115423a71519ddcdf4219d9032916865b80f4c6da7479362d9"} Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.921022 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0d69-account-create-update-bhx5p"] Dec 03 12:45:08 crc kubenswrapper[4666]: I1203 12:45:08.984095 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-72mb9"] Dec 03 12:45:09 crc kubenswrapper[4666]: W1203 12:45:09.018684 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1648693_9c33_4c6b_94d9_fa47ea5b38d4.slice/crio-517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564 WatchSource:0}: Error finding container 517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564: Status 404 returned error can't find the container with id 517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564 Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.095554 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jjp74"] Dec 03 12:45:09 crc kubenswrapper[4666]: W1203 12:45:09.101858 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979de73a_2c31_41fa_aeb4_fab22feeacc0.slice/crio-566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5 WatchSource:0}: Error finding container 566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5: Status 404 returned error can't find the container with id 566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5 Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.178576 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-88bf-account-create-update-t9qsf"] Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.189728 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-22jn4"] Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.842328 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0d69-account-create-update-bhx5p" event={"ID":"6b9f6c9f-cd12-43d6-b79b-240db68c0e88","Type":"ContainerStarted","Data":"a1515810d3ef40520c3156b32c4ca9d6541943ef1ffbe086c4e25eafcb552d18"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.842380 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0d69-account-create-update-bhx5p" event={"ID":"6b9f6c9f-cd12-43d6-b79b-240db68c0e88","Type":"ContainerStarted","Data":"7a59b51e1e37221cbc49c5e40c38f8a2d090f7f1bd59f0c01bba8e6b333e5590"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.844930 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a6f-account-create-update-gr78s" 
event={"ID":"0a0e6849-2521-4df6-b2f2-667769034675","Type":"ContainerStarted","Data":"7206ef27d76935440c78e0761162410caad11f5f1cec71ff05893027332a49ec"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.844952 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a6f-account-create-update-gr78s" event={"ID":"0a0e6849-2521-4df6-b2f2-667769034675","Type":"ContainerStarted","Data":"1cedbea56a6a65347fefacd714dffffa3b24f477c08f53b39229df1cc1d6938d"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.846381 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-88bf-account-create-update-t9qsf" event={"ID":"aae6d2ff-7625-402d-8b3d-ac215f993f2c","Type":"ContainerStarted","Data":"0208e379195b8d165c9277f636ade29dd399cd283bfd12c5369ec8b6e422b130"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.846404 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-88bf-account-create-update-t9qsf" event={"ID":"aae6d2ff-7625-402d-8b3d-ac215f993f2c","Type":"ContainerStarted","Data":"23a41a5cc36c10aefb8b510f2513191af252a48d1171f3ccabe68dfe4a8d2fcc"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.847826 4666 generic.go:334] "Generic (PLEG): container finished" podID="d678da1e-e438-4e63-99e4-8eb737a077f5" containerID="f9f7c4ce5bd86704f5384d052989d2abc2521f44ae6afa2cc63324931b06015c" exitCode=0 Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.847900 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vzd52" event={"ID":"d678da1e-e438-4e63-99e4-8eb737a077f5","Type":"ContainerDied","Data":"f9f7c4ce5bd86704f5384d052989d2abc2521f44ae6afa2cc63324931b06015c"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.849065 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72mb9" event={"ID":"b1648693-9c33-4c6b-94d9-fa47ea5b38d4","Type":"ContainerStarted","Data":"517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.853872 4666 generic.go:334] "Generic (PLEG): container finished" podID="823d7744-f03e-4ed5-b16b-823cf42f9084" containerID="9c0452d5372f8f01316d5ffb31e90b9081b1e9097324e4540db18c9249024106" exitCode=0 Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.854028 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-22jn4" event={"ID":"823d7744-f03e-4ed5-b16b-823cf42f9084","Type":"ContainerDied","Data":"9c0452d5372f8f01316d5ffb31e90b9081b1e9097324e4540db18c9249024106"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.854050 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-22jn4" event={"ID":"823d7744-f03e-4ed5-b16b-823cf42f9084","Type":"ContainerStarted","Data":"cce19e7962df45754e10fa5ff30e66f23ea302362860a588c73d786b4700cc3f"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.858736 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0d69-account-create-update-bhx5p" podStartSLOduration=1.858721596 podStartE2EDuration="1.858721596s" podCreationTimestamp="2025-12-03 12:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:09.85811818 +0000 UTC m=+1898.703079241" watchObservedRunningTime="2025-12-03 12:45:09.858721596 +0000 UTC m=+1898.703682647" Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.859854 4666 generic.go:334] "Generic 
(PLEG): container finished" podID="979de73a-2c31-41fa-aeb4-fab22feeacc0" containerID="cb34e89046aafa174a2aa065c35eafc36135635fa742ea558e1e42c0090e754d" exitCode=0 Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.859894 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jjp74" event={"ID":"979de73a-2c31-41fa-aeb4-fab22feeacc0","Type":"ContainerDied","Data":"cb34e89046aafa174a2aa065c35eafc36135635fa742ea558e1e42c0090e754d"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.859917 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jjp74" event={"ID":"979de73a-2c31-41fa-aeb4-fab22feeacc0","Type":"ContainerStarted","Data":"566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5"} Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.886568 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1a6f-account-create-update-gr78s" podStartSLOduration=2.886552938 podStartE2EDuration="2.886552938s" podCreationTimestamp="2025-12-03 12:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:09.88515537 +0000 UTC m=+1898.730116411" watchObservedRunningTime="2025-12-03 12:45:09.886552938 +0000 UTC m=+1898.731513989" Dec 03 12:45:09 crc kubenswrapper[4666]: I1203 12:45:09.902885 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-88bf-account-create-update-t9qsf" podStartSLOduration=1.902861878 podStartE2EDuration="1.902861878s" podCreationTimestamp="2025-12-03 12:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:09.898517611 +0000 UTC m=+1898.743478662" watchObservedRunningTime="2025-12-03 12:45:09.902861878 +0000 UTC m=+1898.747822929" Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.875132 4666 generic.go:334] "Generic (PLEG): container finished" podID="0a0e6849-2521-4df6-b2f2-667769034675" containerID="7206ef27d76935440c78e0761162410caad11f5f1cec71ff05893027332a49ec" exitCode=0 Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.875706 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a6f-account-create-update-gr78s" event={"ID":"0a0e6849-2521-4df6-b2f2-667769034675","Type":"ContainerDied","Data":"7206ef27d76935440c78e0761162410caad11f5f1cec71ff05893027332a49ec"} Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.879659 4666 generic.go:334] "Generic (PLEG): container finished" podID="aae6d2ff-7625-402d-8b3d-ac215f993f2c" containerID="0208e379195b8d165c9277f636ade29dd399cd283bfd12c5369ec8b6e422b130" exitCode=0 Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.879890 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-88bf-account-create-update-t9qsf" event={"ID":"aae6d2ff-7625-402d-8b3d-ac215f993f2c","Type":"ContainerDied","Data":"0208e379195b8d165c9277f636ade29dd399cd283bfd12c5369ec8b6e422b130"} Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.886163 4666 generic.go:334] "Generic (PLEG): container finished" podID="6b9f6c9f-cd12-43d6-b79b-240db68c0e88" containerID="a1515810d3ef40520c3156b32c4ca9d6541943ef1ffbe086c4e25eafcb552d18" exitCode=0 Dec 03 12:45:10 crc kubenswrapper[4666]: I1203 12:45:10.886497 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0d69-account-create-update-bhx5p" 
event={"ID":"6b9f6c9f-cd12-43d6-b79b-240db68c0e88","Type":"ContainerDied","Data":"a1515810d3ef40520c3156b32c4ca9d6541943ef1ffbe086c4e25eafcb552d18"} Dec 03 12:45:11 crc kubenswrapper[4666]: I1203 12:45:11.528477 4666 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcf41bd6e-97a2-4fe8-abab-ef09475a7c9b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcf41bd6e-97a2-4fe8-abab-ef09475a7c9b] : Timed out while waiting for systemd to remove kubepods-besteffort-podcf41bd6e_97a2_4fe8_abab_ef09475a7c9b.slice" Dec 03 12:45:11 crc kubenswrapper[4666]: E1203 12:45:11.528817 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podcf41bd6e-97a2-4fe8-abab-ef09475a7c9b] : unable to destroy cgroup paths for cgroup [kubepods besteffort podcf41bd6e-97a2-4fe8-abab-ef09475a7c9b] : Timed out while waiting for systemd to remove kubepods-besteffort-podcf41bd6e_97a2_4fe8_abab_ef09475a7c9b.slice" pod="openstack/glance-f6df-account-create-update-rnxbk" podUID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" Dec 03 12:45:11 crc kubenswrapper[4666]: I1203 12:45:11.895594 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f6df-account-create-update-rnxbk" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.867515 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.876904 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.878205 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.890812 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.922381 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.922846 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a6f-account-create-update-gr78s" event={"ID":"0a0e6849-2521-4df6-b2f2-667769034675","Type":"ContainerDied","Data":"1cedbea56a6a65347fefacd714dffffa3b24f477c08f53b39229df1cc1d6938d"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.922886 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cedbea56a6a65347fefacd714dffffa3b24f477c08f53b39229df1cc1d6938d" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.922942 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1a6f-account-create-update-gr78s" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.924792 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-88bf-account-create-update-t9qsf" event={"ID":"aae6d2ff-7625-402d-8b3d-ac215f993f2c","Type":"ContainerDied","Data":"23a41a5cc36c10aefb8b510f2513191af252a48d1171f3ccabe68dfe4a8d2fcc"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.924813 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a41a5cc36c10aefb8b510f2513191af252a48d1171f3ccabe68dfe4a8d2fcc" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.924973 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.925764 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vzd52" event={"ID":"d678da1e-e438-4e63-99e4-8eb737a077f5","Type":"ContainerDied","Data":"553227ea0561d1115423a71519ddcdf4219d9032916865b80f4c6da7479362d9"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.925786 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="553227ea0561d1115423a71519ddcdf4219d9032916865b80f4c6da7479362d9" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.925818 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vzd52" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.927326 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-22jn4" event={"ID":"823d7744-f03e-4ed5-b16b-823cf42f9084","Type":"ContainerDied","Data":"cce19e7962df45754e10fa5ff30e66f23ea302362860a588c73d786b4700cc3f"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.927348 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce19e7962df45754e10fa5ff30e66f23ea302362860a588c73d786b4700cc3f" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.927364 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-22jn4" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.928964 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jjp74" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.928971 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jjp74" event={"ID":"979de73a-2c31-41fa-aeb4-fab22feeacc0","Type":"ContainerDied","Data":"566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.929001 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="566e0c22b1df949f337edf9ce6e29aa83dd5d496d98a341d75ff2f8bf15c77d5" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.930046 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0d69-account-create-update-bhx5p" event={"ID":"6b9f6c9f-cd12-43d6-b79b-240db68c0e88","Type":"ContainerDied","Data":"7a59b51e1e37221cbc49c5e40c38f8a2d090f7f1bd59f0c01bba8e6b333e5590"} Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.930068 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a59b51e1e37221cbc49c5e40c38f8a2d090f7f1bd59f0c01bba8e6b333e5590" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.930126 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0d69-account-create-update-bhx5p" Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.931297 4666 generic.go:334] "Generic (PLEG): container finished" podID="d22dc9c8-5134-46ec-ba0f-c2b334490035" containerID="0b47f7cee016ee1af5123f69f109c02adc62dccb78c897fbd568bcf9de53c150" exitCode=0 Dec 03 12:45:13 crc kubenswrapper[4666]: I1203 12:45:13.931322 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7ljg8" event={"ID":"d22dc9c8-5134-46ec-ba0f-c2b334490035","Type":"ContainerDied","Data":"0b47f7cee016ee1af5123f69f109c02adc62dccb78c897fbd568bcf9de53c150"} Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020653 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6l5\" (UniqueName: \"kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5\") pod \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020706 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts\") pod \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020732 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh4x8\" (UniqueName: \"kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8\") pod \"979de73a-2c31-41fa-aeb4-fab22feeacc0\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020793 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k622j\" (UniqueName: \"kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j\") pod \"d678da1e-e438-4e63-99e4-8eb737a077f5\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020874 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrpkn\" (UniqueName: 
\"kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn\") pod \"0a0e6849-2521-4df6-b2f2-667769034675\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020916 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5gtf\" (UniqueName: \"kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf\") pod \"823d7744-f03e-4ed5-b16b-823cf42f9084\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020943 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts\") pod \"823d7744-f03e-4ed5-b16b-823cf42f9084\" (UID: \"823d7744-f03e-4ed5-b16b-823cf42f9084\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020963 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts\") pod \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\" (UID: \"aae6d2ff-7625-402d-8b3d-ac215f993f2c\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.020983 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts\") pod \"0a0e6849-2521-4df6-b2f2-667769034675\" (UID: \"0a0e6849-2521-4df6-b2f2-667769034675\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.021002 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqlr4\" (UniqueName: \"kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4\") pod \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\" (UID: \"6b9f6c9f-cd12-43d6-b79b-240db68c0e88\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.021027 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts\") pod \"979de73a-2c31-41fa-aeb4-fab22feeacc0\" (UID: \"979de73a-2c31-41fa-aeb4-fab22feeacc0\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.021106 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts\") pod \"d678da1e-e438-4e63-99e4-8eb737a077f5\" (UID: \"d678da1e-e438-4e63-99e4-8eb737a077f5\") " Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.021846 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "823d7744-f03e-4ed5-b16b-823cf42f9084" (UID: "823d7744-f03e-4ed5-b16b-823cf42f9084"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.021940 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a0e6849-2521-4df6-b2f2-667769034675" (UID: "0a0e6849-2521-4df6-b2f2-667769034675"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.022029 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d678da1e-e438-4e63-99e4-8eb737a077f5" (UID: "d678da1e-e438-4e63-99e4-8eb737a077f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.022190 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "979de73a-2c31-41fa-aeb4-fab22feeacc0" (UID: "979de73a-2c31-41fa-aeb4-fab22feeacc0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.022247 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9f6c9f-cd12-43d6-b79b-240db68c0e88" (UID: "6b9f6c9f-cd12-43d6-b79b-240db68c0e88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.022404 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aae6d2ff-7625-402d-8b3d-ac215f993f2c" (UID: "aae6d2ff-7625-402d-8b3d-ac215f993f2c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.026288 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn" (OuterVolumeSpecName: "kube-api-access-qrpkn") pod "0a0e6849-2521-4df6-b2f2-667769034675" (UID: "0a0e6849-2521-4df6-b2f2-667769034675"). InnerVolumeSpecName "kube-api-access-qrpkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.026318 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf" (OuterVolumeSpecName: "kube-api-access-v5gtf") pod "823d7744-f03e-4ed5-b16b-823cf42f9084" (UID: "823d7744-f03e-4ed5-b16b-823cf42f9084"). InnerVolumeSpecName "kube-api-access-v5gtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.026308 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4" (OuterVolumeSpecName: "kube-api-access-hqlr4") pod "6b9f6c9f-cd12-43d6-b79b-240db68c0e88" (UID: "6b9f6c9f-cd12-43d6-b79b-240db68c0e88"). InnerVolumeSpecName "kube-api-access-hqlr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.026396 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j" (OuterVolumeSpecName: "kube-api-access-k622j") pod "d678da1e-e438-4e63-99e4-8eb737a077f5" (UID: "d678da1e-e438-4e63-99e4-8eb737a077f5"). InnerVolumeSpecName "kube-api-access-k622j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.026694 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5" (OuterVolumeSpecName: "kube-api-access-5m6l5") pod "aae6d2ff-7625-402d-8b3d-ac215f993f2c" (UID: "aae6d2ff-7625-402d-8b3d-ac215f993f2c"). InnerVolumeSpecName "kube-api-access-5m6l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.027453 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8" (OuterVolumeSpecName: "kube-api-access-zh4x8") pod "979de73a-2c31-41fa-aeb4-fab22feeacc0" (UID: "979de73a-2c31-41fa-aeb4-fab22feeacc0"). InnerVolumeSpecName "kube-api-access-zh4x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123311 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrpkn\" (UniqueName: \"kubernetes.io/projected/0a0e6849-2521-4df6-b2f2-667769034675-kube-api-access-qrpkn\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123547 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5gtf\" (UniqueName: \"kubernetes.io/projected/823d7744-f03e-4ed5-b16b-823cf42f9084-kube-api-access-v5gtf\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123622 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823d7744-f03e-4ed5-b16b-823cf42f9084-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123785 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aae6d2ff-7625-402d-8b3d-ac215f993f2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123854 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a0e6849-2521-4df6-b2f2-667769034675-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123917 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqlr4\" (UniqueName: \"kubernetes.io/projected/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-kube-api-access-hqlr4\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.123977 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/979de73a-2c31-41fa-aeb4-fab22feeacc0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.124036 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d678da1e-e438-4e63-99e4-8eb737a077f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.124107 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6l5\" (UniqueName: \"kubernetes.io/projected/aae6d2ff-7625-402d-8b3d-ac215f993f2c-kube-api-access-5m6l5\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.124246 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f6c9f-cd12-43d6-b79b-240db68c0e88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.124311 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh4x8\" (UniqueName: \"kubernetes.io/projected/979de73a-2c31-41fa-aeb4-fab22feeacc0-kube-api-access-zh4x8\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.124365 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k622j\" (UniqueName: \"kubernetes.io/projected/d678da1e-e438-4e63-99e4-8eb737a077f5-kube-api-access-k622j\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.425101 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:45:14 crc kubenswrapper[4666]: E1203 12:45:14.425390 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.940975 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-88bf-account-create-update-t9qsf" Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.947119 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72mb9" event={"ID":"b1648693-9c33-4c6b-94d9-fa47ea5b38d4","Type":"ContainerStarted","Data":"5978737f84e7edf423b566c03c708dd64dd1d9993effc3fbc93dd0b9cb2343b0"} Dec 03 12:45:14 crc kubenswrapper[4666]: I1203 12:45:14.991722 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-72mb9" podStartSLOduration=2.260669055 podStartE2EDuration="6.991706272s" podCreationTimestamp="2025-12-03 12:45:08 +0000 UTC" firstStartedPulling="2025-12-03 12:45:09.024678587 +0000 UTC m=+1897.869639638" lastFinishedPulling="2025-12-03 12:45:13.755715804 +0000 UTC m=+1902.600676855" observedRunningTime="2025-12-03 12:45:14.96942993 +0000 UTC m=+1903.814390991" watchObservedRunningTime="2025-12-03 12:45:14.991706272 +0000 UTC m=+1903.836667323" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.310007 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7ljg8" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.455121 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data\") pod \"d22dc9c8-5134-46ec-ba0f-c2b334490035\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.455226 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89h9h\" (UniqueName: \"kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h\") pod \"d22dc9c8-5134-46ec-ba0f-c2b334490035\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.455324 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data\") pod \"d22dc9c8-5134-46ec-ba0f-c2b334490035\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.455463 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle\") pod \"d22dc9c8-5134-46ec-ba0f-c2b334490035\" (UID: \"d22dc9c8-5134-46ec-ba0f-c2b334490035\") " Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.459245 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d22dc9c8-5134-46ec-ba0f-c2b334490035" (UID: "d22dc9c8-5134-46ec-ba0f-c2b334490035"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.459283 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h" (OuterVolumeSpecName: "kube-api-access-89h9h") pod "d22dc9c8-5134-46ec-ba0f-c2b334490035" (UID: "d22dc9c8-5134-46ec-ba0f-c2b334490035"). InnerVolumeSpecName "kube-api-access-89h9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.497823 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22dc9c8-5134-46ec-ba0f-c2b334490035" (UID: "d22dc9c8-5134-46ec-ba0f-c2b334490035"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.506821 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data" (OuterVolumeSpecName: "config-data") pod "d22dc9c8-5134-46ec-ba0f-c2b334490035" (UID: "d22dc9c8-5134-46ec-ba0f-c2b334490035"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.557770 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.557803 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89h9h\" (UniqueName: \"kubernetes.io/projected/d22dc9c8-5134-46ec-ba0f-c2b334490035-kube-api-access-89h9h\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.557813 4666 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.557823 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22dc9c8-5134-46ec-ba0f-c2b334490035-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.951506 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7ljg8" Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.951492 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7ljg8" event={"ID":"d22dc9c8-5134-46ec-ba0f-c2b334490035","Type":"ContainerDied","Data":"7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302"} Dec 03 12:45:15 crc kubenswrapper[4666]: I1203 12:45:15.951569 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea9573f6e47dbe1fd690239e96a74aa77e79b9938713a7ff5ad4b61c9ac8302" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.343750 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344453 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae6d2ff-7625-402d-8b3d-ac215f993f2c" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344477 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae6d2ff-7625-402d-8b3d-ac215f993f2c" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344498 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9f6c9f-cd12-43d6-b79b-240db68c0e88" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344507 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9f6c9f-cd12-43d6-b79b-240db68c0e88" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344518 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0e6849-2521-4df6-b2f2-667769034675" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344528 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0e6849-2521-4df6-b2f2-667769034675" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344539 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d678da1e-e438-4e63-99e4-8eb737a077f5" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 
12:45:16.344547 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d678da1e-e438-4e63-99e4-8eb737a077f5" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344568 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823d7744-f03e-4ed5-b16b-823cf42f9084" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344575 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="823d7744-f03e-4ed5-b16b-823cf42f9084" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344590 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979de73a-2c31-41fa-aeb4-fab22feeacc0" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344600 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="979de73a-2c31-41fa-aeb4-fab22feeacc0" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: E1203 12:45:16.344620 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22dc9c8-5134-46ec-ba0f-c2b334490035" containerName="glance-db-sync" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344628 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22dc9c8-5134-46ec-ba0f-c2b334490035" containerName="glance-db-sync" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344815 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="823d7744-f03e-4ed5-b16b-823cf42f9084" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344874 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae6d2ff-7625-402d-8b3d-ac215f993f2c" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344903 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d678da1e-e438-4e63-99e4-8eb737a077f5" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344925 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0e6849-2521-4df6-b2f2-667769034675" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344946 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9f6c9f-cd12-43d6-b79b-240db68c0e88" containerName="mariadb-account-create-update" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344968 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22dc9c8-5134-46ec-ba0f-c2b334490035" containerName="glance-db-sync" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.344987 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="979de73a-2c31-41fa-aeb4-fab22feeacc0" containerName="mariadb-database-create" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.345998 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.379542 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.471734 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc25\" (UniqueName: \"kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.471818 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.471837 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.471863 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.471886 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.573892 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc25\" (UniqueName: \"kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.575251 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.575296 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.575321 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.576050 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.576137 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.576221 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.576321 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.576655 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.596462 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc25\" (UniqueName: \"kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25\") pod \"dnsmasq-dns-54f9b7b8d9-2rmm6\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.678543 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.963872 4666 generic.go:334] "Generic (PLEG): container finished" podID="b1648693-9c33-4c6b-94d9-fa47ea5b38d4" containerID="5978737f84e7edf423b566c03c708dd64dd1d9993effc3fbc93dd0b9cb2343b0" exitCode=0 Dec 03 12:45:16 crc kubenswrapper[4666]: I1203 12:45:16.964058 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72mb9" event={"ID":"b1648693-9c33-4c6b-94d9-fa47ea5b38d4","Type":"ContainerDied","Data":"5978737f84e7edf423b566c03c708dd64dd1d9993effc3fbc93dd0b9cb2343b0"} Dec 03 12:45:17 crc kubenswrapper[4666]: I1203 12:45:17.188535 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:17 crc kubenswrapper[4666]: W1203 12:45:17.194698 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2642aa0c_c8c1_40f5_93bd_e4ffae6064de.slice/crio-12567a21492c979a40ce19ceae732232b4c276bca497707a907fc66b344d5fd4 WatchSource:0}: Error finding container 12567a21492c979a40ce19ceae732232b4c276bca497707a907fc66b344d5fd4: Status 404 returned error can't find the container with id 12567a21492c979a40ce19ceae732232b4c276bca497707a907fc66b344d5fd4 Dec 03 12:45:17 crc kubenswrapper[4666]: I1203 12:45:17.974276 4666 generic.go:334] "Generic (PLEG): container finished" podID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerID="5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310" exitCode=0 Dec 03 12:45:17 crc kubenswrapper[4666]: I1203 12:45:17.974619 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" event={"ID":"2642aa0c-c8c1-40f5-93bd-e4ffae6064de","Type":"ContainerDied","Data":"5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310"} Dec 03 12:45:17 crc kubenswrapper[4666]: I1203 12:45:17.974663 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" event={"ID":"2642aa0c-c8c1-40f5-93bd-e4ffae6064de","Type":"ContainerStarted","Data":"12567a21492c979a40ce19ceae732232b4c276bca497707a907fc66b344d5fd4"} Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.290768 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.472333 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data\") pod \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.472425 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbz94\" (UniqueName: \"kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94\") pod \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.472463 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle\") pod \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\" (UID: \"b1648693-9c33-4c6b-94d9-fa47ea5b38d4\") " Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.478600 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94" (OuterVolumeSpecName: "kube-api-access-mbz94") pod "b1648693-9c33-4c6b-94d9-fa47ea5b38d4" (UID: "b1648693-9c33-4c6b-94d9-fa47ea5b38d4"). InnerVolumeSpecName "kube-api-access-mbz94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.507944 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1648693-9c33-4c6b-94d9-fa47ea5b38d4" (UID: "b1648693-9c33-4c6b-94d9-fa47ea5b38d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.514897 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data" (OuterVolumeSpecName: "config-data") pod "b1648693-9c33-4c6b-94d9-fa47ea5b38d4" (UID: "b1648693-9c33-4c6b-94d9-fa47ea5b38d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.573682 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.573718 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.573727 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbz94\" (UniqueName: \"kubernetes.io/projected/b1648693-9c33-4c6b-94d9-fa47ea5b38d4-kube-api-access-mbz94\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.996177 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72mb9" event={"ID":"b1648693-9c33-4c6b-94d9-fa47ea5b38d4","Type":"ContainerDied","Data":"517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564"} Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.997070 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="517dce29e5bf7e08e6339d4ab9f34debf920ba3bc28565c3d2d3b7fc0bd14564" Dec 03 12:45:18 crc kubenswrapper[4666]: I1203 12:45:18.996716 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-72mb9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.007194 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" event={"ID":"2642aa0c-c8c1-40f5-93bd-e4ffae6064de","Type":"ContainerStarted","Data":"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd"} Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.007376 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:19 crc kubenswrapper[4666]: E1203 12:45:19.029473 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1648693_9c33_4c6b_94d9_fa47ea5b38d4.slice\": RecentStats: unable to find data in memory cache]" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.037450 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" podStartSLOduration=3.037430168 podStartE2EDuration="3.037430168s" podCreationTimestamp="2025-12-03 12:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:19.034324184 +0000 UTC m=+1907.879285245" watchObservedRunningTime="2025-12-03 12:45:19.037430168 +0000 UTC m=+1907.882391209" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.247204 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.295717 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9vvzq"] Dec 03 12:45:19 crc kubenswrapper[4666]: E1203 12:45:19.296126 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1648693-9c33-4c6b-94d9-fa47ea5b38d4" containerName="keystone-db-sync" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 
12:45:19.296141 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1648693-9c33-4c6b-94d9-fa47ea5b38d4" containerName="keystone-db-sync" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.296329 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1648693-9c33-4c6b-94d9-fa47ea5b38d4" containerName="keystone-db-sync" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.296886 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.317250 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.318601 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9vvzq"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.318706 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.349548 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.349681 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.349846 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l82fb" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.349922 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.350114 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.350507 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390668 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390745 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390772 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390848 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x85\" (UniqueName: \"kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") 
" pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390918 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.390962 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492768 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492816 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492866 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x85\" (UniqueName: \"kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492927 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492952 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5t6\" (UniqueName: \"kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.492990 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.493041 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: 
\"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.493068 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.493131 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.493170 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.493194 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.506728 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.507207 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.509042 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.509810 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.510284 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.523206 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g6x85\" (UniqueName: \"kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85\") pod \"keystone-bootstrap-9vvzq\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.534227 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.537220 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.548617 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.549260 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.564767 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.594396 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.594441 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.594495 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.594514 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5t6\" (UniqueName: \"kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.594578 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.595429 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.596685 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.597298 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.597549 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cnrgx"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.597846 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.598846 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.604653 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cz97w" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.604764 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.605000 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.628627 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5t6\" (UniqueName: \"kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6\") pod \"dnsmasq-dns-6546db6db7-nhlzv\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.635006 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vrrxv"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.637223 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.642704 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.642947 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.643113 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mlszd" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.654999 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.656205 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.672171 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.692957 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vrrxv"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.696871 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.696922 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.696957 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.696995 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697061 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697077 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697105 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697125 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggft\" (UniqueName: \"kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697144 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config\") pod \"neutron-db-sync-cnrgx\" (UID: 
\"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.697226 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dt9m\" (UniqueName: \"kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.715134 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cnrgx"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.741501 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.743562 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.778207 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v65m9"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.780957 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.790063 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.791347 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.791553 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-26w9m" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.794425 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799465 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799546 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799566 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799592 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799627 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799661 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4l2f\" (UniqueName: \"kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799694 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799715 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799732 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799749 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799764 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggft\" (UniqueName: \"kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799785 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799809 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799830 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id\") pod 
\"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799854 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.799881 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dt9m\" (UniqueName: \"kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.800718 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.801020 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.811741 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.813677 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.826079 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle\") pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.827983 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.829556 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.835789 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggft\" (UniqueName: \"kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft\") 
pod \"neutron-db-sync-cnrgx\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.836350 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.837868 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dt9m\" (UniqueName: \"kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m\") pod \"ceilometer-0\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.865768 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v65m9"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.875037 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-g4zjc"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.876886 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.886078 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.886241 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lmncm" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.891441 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g4zjc"] Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.893822 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907120 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907174 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907203 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907222 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907242 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907257 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907295 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907316 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907339 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc 
kubenswrapper[4666]: I1203 12:45:19.907357 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907391 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q642v\" (UniqueName: \"kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907409 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907432 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907452 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907494 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4l2f\" (UniqueName: \"kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.907511 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtzx4\" (UniqueName: \"kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.908422 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.912203 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.912498 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.917659 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.922404 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.924396 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:19 crc kubenswrapper[4666]: I1203 12:45:19.948283 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4l2f\" (UniqueName: \"kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f\") pod \"cinder-db-sync-vrrxv\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.008514 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.008564 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.008595 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.008652 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.008679 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009058 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009135 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q642v\" (UniqueName: \"kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009164 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009196 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009224 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009257 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009300 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtzx4\" (UniqueName: \"kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009323 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.009722 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.010348 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.011985 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.012890 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.014683 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.014953 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.016886 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.019718 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.040693 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.052976 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtzx4\" (UniqueName: \"kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4\") pod \"placement-db-sync-v65m9\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.062981 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q642v\" (UniqueName: \"kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v\") pod \"dnsmasq-dns-7987f74bbc-pp2z9\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.076473 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.120976 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.121063 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.121144 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.135814 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.138880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.139265 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.148266 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65\") pod \"barbican-db-sync-g4zjc\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.220709 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.255661 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9vvzq"] Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.317371 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:20 crc kubenswrapper[4666]: I1203 12:45:20.690113 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cnrgx"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.574895 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" event={"ID":"e2b2c19d-119a-4ebb-b708-100f6eee79c3","Type":"ContainerStarted","Data":"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88"} Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.575215 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" event={"ID":"e2b2c19d-119a-4ebb-b708-100f6eee79c3","Type":"ContainerStarted","Data":"0998ac627226b448718a9629718207418217afc9af7eb52506731a1794676156"} Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.575341 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" podUID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" containerName="init" containerID="cri-o://209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88" gracePeriod=10 Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.580224 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g4zjc"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.586888 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cnrgx" event={"ID":"3fc7cb74-0c27-4786-93fc-31c0e3a565b7","Type":"ContainerStarted","Data":"94a3b124e22c2b36536f1647f851a7add8e2b2e1bc9c0880551fcceb41231ed3"} Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.595261 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.599834 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="dnsmasq-dns" containerID="cri-o://341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd" gracePeriod=10 Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.600854 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vvzq" event={"ID":"1450736e-07b1-4cdd-b5da-8026bda84463","Type":"ContainerStarted","Data":"f301d3cd2c534cc6662de316381561e2d6de6a79f6fbd6a660f684cda67c58e4"} Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.600890 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vvzq" event={"ID":"1450736e-07b1-4cdd-b5da-8026bda84463","Type":"ContainerStarted","Data":"321e8e705066603f6bdcdb63c1d970243006a17386c81dfa0e795ce9e73fbb89"} Dec 03 12:45:21 crc kubenswrapper[4666]: W1203 12:45:21.640701 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ac4b252_1187_43c2_bfdd_0d48db4ff9e8.slice/crio-26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd WatchSource:0}: Error finding container 26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd: 
Status 404 returned error can't find the container with id 26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.666532 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v65m9"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.691049 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vrrxv"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.709071 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.729670 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9vvzq" podStartSLOduration=2.729655382 podStartE2EDuration="2.729655382s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:21.681293476 +0000 UTC m=+1910.526254547" watchObservedRunningTime="2025-12-03 12:45:21.729655382 +0000 UTC m=+1910.574616433" Dec 03 12:45:21 crc kubenswrapper[4666]: I1203 12:45:21.978785 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.030509 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb\") pod \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.030546 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb\") pod \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.030573 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq5t6\" (UniqueName: \"kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6\") pod \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.030630 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config\") pod \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.030650 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc\") pod \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\" (UID: \"e2b2c19d-119a-4ebb-b708-100f6eee79c3\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.073359 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6" (OuterVolumeSpecName: "kube-api-access-rq5t6") pod "e2b2c19d-119a-4ebb-b708-100f6eee79c3" (UID: "e2b2c19d-119a-4ebb-b708-100f6eee79c3"). InnerVolumeSpecName "kube-api-access-rq5t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.073855 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2b2c19d-119a-4ebb-b708-100f6eee79c3" (UID: "e2b2c19d-119a-4ebb-b708-100f6eee79c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.074533 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2b2c19d-119a-4ebb-b708-100f6eee79c3" (UID: "e2b2c19d-119a-4ebb-b708-100f6eee79c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.083000 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.086740 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2b2c19d-119a-4ebb-b708-100f6eee79c3" (UID: "e2b2c19d-119a-4ebb-b708-100f6eee79c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.088853 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config" (OuterVolumeSpecName: "config") pod "e2b2c19d-119a-4ebb-b708-100f6eee79c3" (UID: "e2b2c19d-119a-4ebb-b708-100f6eee79c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.131958 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config\") pod \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132050 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc\") pod \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132077 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb\") pod \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132171 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb\") pod \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132195 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc25\" (UniqueName: \"kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25\") pod \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\" (UID: \"2642aa0c-c8c1-40f5-93bd-e4ffae6064de\") " Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132538 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132556 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132565 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132574 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b2c19d-119a-4ebb-b708-100f6eee79c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.132583 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq5t6\" (UniqueName: \"kubernetes.io/projected/e2b2c19d-119a-4ebb-b708-100f6eee79c3-kube-api-access-rq5t6\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.137384 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25" (OuterVolumeSpecName: "kube-api-access-mpc25") pod "2642aa0c-c8c1-40f5-93bd-e4ffae6064de" (UID: "2642aa0c-c8c1-40f5-93bd-e4ffae6064de"). InnerVolumeSpecName "kube-api-access-mpc25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.178797 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config" (OuterVolumeSpecName: "config") pod "2642aa0c-c8c1-40f5-93bd-e4ffae6064de" (UID: "2642aa0c-c8c1-40f5-93bd-e4ffae6064de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.185010 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2642aa0c-c8c1-40f5-93bd-e4ffae6064de" (UID: "2642aa0c-c8c1-40f5-93bd-e4ffae6064de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.186570 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2642aa0c-c8c1-40f5-93bd-e4ffae6064de" (UID: "2642aa0c-c8c1-40f5-93bd-e4ffae6064de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.200552 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2642aa0c-c8c1-40f5-93bd-e4ffae6064de" (UID: "2642aa0c-c8c1-40f5-93bd-e4ffae6064de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.233966 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.234006 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.234020 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.234034 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.234047 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpc25\" (UniqueName: \"kubernetes.io/projected/2642aa0c-c8c1-40f5-93bd-e4ffae6064de-kube-api-access-mpc25\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.649345 4666 generic.go:334] "Generic (PLEG): container finished" podID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerID="4568f25f6866e87575936ec8181dc91d542f257b2853ef2674c5147713585ebd" exitCode=0 Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.649414 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" 
event={"ID":"0752436b-39dd-4cae-87a4-1b51901ad71f","Type":"ContainerDied","Data":"4568f25f6866e87575936ec8181dc91d542f257b2853ef2674c5147713585ebd"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.649440 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" event={"ID":"0752436b-39dd-4cae-87a4-1b51901ad71f","Type":"ContainerStarted","Data":"9b7c5f238870acef15e990e94dcc7e143a2827531470c3179a1f5f22c3514eb0"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.678953 4666 generic.go:334] "Generic (PLEG): container finished" podID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerID="341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd" exitCode=0 Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.679042 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" event={"ID":"2642aa0c-c8c1-40f5-93bd-e4ffae6064de","Type":"ContainerDied","Data":"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.679069 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" event={"ID":"2642aa0c-c8c1-40f5-93bd-e4ffae6064de","Type":"ContainerDied","Data":"12567a21492c979a40ce19ceae732232b4c276bca497707a907fc66b344d5fd4"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.679101 4666 scope.go:117] "RemoveContainer" containerID="341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.679233 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-2rmm6" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.694302 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g4zjc" event={"ID":"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8","Type":"ContainerStarted","Data":"26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.702337 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v65m9" event={"ID":"17f8d025-21f1-4e23-9e6a-75cf2202e447","Type":"ContainerStarted","Data":"aa15b04462ffc18ebce3dcc352fb5394b04accbca09716c14f540f0fc35ef582"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.707070 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.708237 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrrxv" event={"ID":"9aac4d80-7d0d-4037-a398-6a28ab35d1c9","Type":"ContainerStarted","Data":"999ece3b77aa113ab5effe7978ad4f4730d08d33ed404e61b436ff478d1db290"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.710829 4666 generic.go:334] "Generic (PLEG): container finished" podID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" containerID="209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88" exitCode=0 Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.710888 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" event={"ID":"e2b2c19d-119a-4ebb-b708-100f6eee79c3","Type":"ContainerDied","Data":"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.710905 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" 
event={"ID":"e2b2c19d-119a-4ebb-b708-100f6eee79c3","Type":"ContainerDied","Data":"0998ac627226b448718a9629718207418217afc9af7eb52506731a1794676156"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.710987 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-nhlzv" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.721367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cnrgx" event={"ID":"3fc7cb74-0c27-4786-93fc-31c0e3a565b7","Type":"ContainerStarted","Data":"042a9505d607d6be8f5cae0985be852eb4b16770e8e2dce82be6d90c7dcc62ef"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.730064 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerStarted","Data":"145df2e61fc81d7a260a88aa4cb98d17fddd073cf9d5d319a194585427425165"} Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.756475 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cnrgx" podStartSLOduration=3.756453659 podStartE2EDuration="3.756453659s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:22.742955284 +0000 UTC m=+1911.587916345" watchObservedRunningTime="2025-12-03 12:45:22.756453659 +0000 UTC m=+1911.601414730" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.780150 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.783495 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-2rmm6"] Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.803647 4666 scope.go:117] "RemoveContainer" containerID="5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.831679 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.842356 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-nhlzv"] Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.873213 4666 scope.go:117] "RemoveContainer" containerID="341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd" Dec 03 12:45:22 crc kubenswrapper[4666]: E1203 12:45:22.877416 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd\": container with ID starting with 341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd not found: ID does not exist" containerID="341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.877461 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd"} err="failed to get container status \"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd\": rpc error: code = NotFound desc = could not find container \"341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd\": container with ID starting with 341dcb4259be35ea9edb06d457362ab8cf0400c1ca93cf6caa8222cfa31879bd not 
found: ID does not exist" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.877490 4666 scope.go:117] "RemoveContainer" containerID="5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310" Dec 03 12:45:22 crc kubenswrapper[4666]: E1203 12:45:22.879806 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310\": container with ID starting with 5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310 not found: ID does not exist" containerID="5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.879832 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310"} err="failed to get container status \"5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310\": rpc error: code = NotFound desc = could not find container \"5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310\": container with ID starting with 5fd077b5bcff4364c443d0c4a0c1dfa6afe5059f6afda22727aa164cf9fd1310 not found: ID does not exist" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.879847 4666 scope.go:117] "RemoveContainer" containerID="209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.970254 4666 scope.go:117] "RemoveContainer" containerID="209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88" Dec 03 12:45:22 crc kubenswrapper[4666]: E1203 12:45:22.976043 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88\": container with ID starting with 209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88 not found: ID does not exist" containerID="209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88" Dec 03 12:45:22 crc kubenswrapper[4666]: I1203 12:45:22.976098 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88"} err="failed to get container status \"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88\": rpc error: code = NotFound desc = could not find container \"209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88\": container with ID starting with 209c140c4ff84be578fb1952d15b7538264b66cea2c26bda667513859d2e6d88 not found: ID does not exist" Dec 03 12:45:23 crc kubenswrapper[4666]: I1203 12:45:23.433626 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" path="/var/lib/kubelet/pods/2642aa0c-c8c1-40f5-93bd-e4ffae6064de/volumes" Dec 03 12:45:23 crc kubenswrapper[4666]: I1203 12:45:23.434220 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" path="/var/lib/kubelet/pods/e2b2c19d-119a-4ebb-b708-100f6eee79c3/volumes" Dec 03 12:45:23 crc kubenswrapper[4666]: I1203 12:45:23.761250 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" event={"ID":"0752436b-39dd-4cae-87a4-1b51901ad71f","Type":"ContainerStarted","Data":"7aa4033ec778a65011d4cf5e80f67cca6cdc2f9afab2ef8784445ba07595c1e8"} Dec 03 12:45:23 crc kubenswrapper[4666]: I1203 
12:45:23.762242 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:23 crc kubenswrapper[4666]: I1203 12:45:23.793547 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" podStartSLOduration=4.793528712 podStartE2EDuration="4.793528712s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:23.78936404 +0000 UTC m=+1912.634325091" watchObservedRunningTime="2025-12-03 12:45:23.793528712 +0000 UTC m=+1912.638489763" Dec 03 12:45:25 crc kubenswrapper[4666]: I1203 12:45:25.787204 4666 generic.go:334] "Generic (PLEG): container finished" podID="1450736e-07b1-4cdd-b5da-8026bda84463" containerID="f301d3cd2c534cc6662de316381561e2d6de6a79f6fbd6a660f684cda67c58e4" exitCode=0 Dec 03 12:45:25 crc kubenswrapper[4666]: I1203 12:45:25.788399 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vvzq" event={"ID":"1450736e-07b1-4cdd-b5da-8026bda84463","Type":"ContainerDied","Data":"f301d3cd2c534cc6662de316381561e2d6de6a79f6fbd6a660f684cda67c58e4"} Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.230686 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367162 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367259 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367368 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367396 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367424 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") " Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.367448 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6x85\" (UniqueName: \"kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85\") pod \"1450736e-07b1-4cdd-b5da-8026bda84463\" (UID: \"1450736e-07b1-4cdd-b5da-8026bda84463\") 
" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.374444 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts" (OuterVolumeSpecName: "scripts") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.374616 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.375649 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.377175 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85" (OuterVolumeSpecName: "kube-api-access-g6x85") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "kube-api-access-g6x85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.395380 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data" (OuterVolumeSpecName: "config-data") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.396912 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1450736e-07b1-4cdd-b5da-8026bda84463" (UID: "1450736e-07b1-4cdd-b5da-8026bda84463"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469753 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469808 4666 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469831 4666 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469852 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6x85\" (UniqueName: \"kubernetes.io/projected/1450736e-07b1-4cdd-b5da-8026bda84463-kube-api-access-g6x85\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469872 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.469890 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1450736e-07b1-4cdd-b5da-8026bda84463-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.815707 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vvzq" event={"ID":"1450736e-07b1-4cdd-b5da-8026bda84463","Type":"ContainerDied","Data":"321e8e705066603f6bdcdb63c1d970243006a17386c81dfa0e795ce9e73fbb89"} Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.815752 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="321e8e705066603f6bdcdb63c1d970243006a17386c81dfa0e795ce9e73fbb89" Dec 03 12:45:28 crc kubenswrapper[4666]: I1203 12:45:28.815764 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9vvzq" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.334831 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9vvzq"] Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.342904 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9vvzq"] Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.422575 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2lngb"] Dec 03 12:45:29 crc kubenswrapper[4666]: E1203 12:45:29.423175 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="init" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423204 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="init" Dec 03 12:45:29 crc kubenswrapper[4666]: E1203 12:45:29.423252 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="dnsmasq-dns" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423264 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="dnsmasq-dns" Dec 03 12:45:29 crc kubenswrapper[4666]: E1203 12:45:29.423294 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" containerName="init" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423306 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" containerName="init" Dec 03 12:45:29 crc kubenswrapper[4666]: E1203 12:45:29.423329 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1450736e-07b1-4cdd-b5da-8026bda84463" containerName="keystone-bootstrap" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423341 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1450736e-07b1-4cdd-b5da-8026bda84463" containerName="keystone-bootstrap" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423613 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1450736e-07b1-4cdd-b5da-8026bda84463" containerName="keystone-bootstrap" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423655 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b2c19d-119a-4ebb-b708-100f6eee79c3" containerName="init" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.423677 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="2642aa0c-c8c1-40f5-93bd-e4ffae6064de" containerName="dnsmasq-dns" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.424625 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.427786 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.427963 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l82fb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.428154 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.428406 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.428569 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.456430 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1450736e-07b1-4cdd-b5da-8026bda84463" path="/var/lib/kubelet/pods/1450736e-07b1-4cdd-b5da-8026bda84463/volumes" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.457133 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2lngb"] Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.460803 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:45:29 crc kubenswrapper[4666]: E1203 12:45:29.461265 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.586777 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.586852 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.586873 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.586920 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 
12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.586940 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7wv\" (UniqueName: \"kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.587097 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688384 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688466 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688494 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688509 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688537 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.688557 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7wv\" (UniqueName: \"kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.694511 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.694969 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.695259 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.695484 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.709719 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.714687 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7wv\" (UniqueName: \"kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv\") pod \"keystone-bootstrap-2lngb\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:29 crc kubenswrapper[4666]: I1203 12:45:29.770301 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:30 crc kubenswrapper[4666]: I1203 12:45:30.078198 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:45:30 crc kubenswrapper[4666]: I1203 12:45:30.138140 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:45:30 crc kubenswrapper[4666]: I1203 12:45:30.138428 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" containerID="cri-o://c5c8364662d2aebcba8d5b3979f64fd1000e8c23732dbf7831ae8e92355f840a" gracePeriod=10 Dec 03 12:45:31 crc kubenswrapper[4666]: I1203 12:45:31.839827 4666 generic.go:334] "Generic (PLEG): container finished" podID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerID="c5c8364662d2aebcba8d5b3979f64fd1000e8c23732dbf7831ae8e92355f840a" exitCode=0 Dec 03 12:45:31 crc kubenswrapper[4666]: I1203 12:45:31.839904 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" event={"ID":"bd2bc7f2-09f4-46d1-8640-183260d1ccb8","Type":"ContainerDied","Data":"c5c8364662d2aebcba8d5b3979f64fd1000e8c23732dbf7831ae8e92355f840a"} Dec 03 12:45:32 crc kubenswrapper[4666]: I1203 12:45:32.065677 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Dec 03 12:45:37 crc kubenswrapper[4666]: I1203 12:45:37.065062 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Dec 03 12:45:40 crc kubenswrapper[4666]: I1203 12:45:40.424318 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:45:40 crc kubenswrapper[4666]: E1203 12:45:40.425033 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:45:42 crc kubenswrapper[4666]: I1203 12:45:42.065395 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Dec 03 12:45:42 crc kubenswrapper[4666]: I1203 12:45:42.066026 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:45:45 crc kubenswrapper[4666]: E1203 12:45:45.933185 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 12:45:45 crc kubenswrapper[4666]: E1203 12:45:45.934035 4666 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c4l2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vrrxv_openstack(9aac4d80-7d0d-4037-a398-6a28ab35d1c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:45:45 crc kubenswrapper[4666]: E1203 12:45:45.935329 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vrrxv" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" Dec 03 12:45:45 crc kubenswrapper[4666]: E1203 12:45:45.981690 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vrrxv" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" Dec 03 12:45:46 crc kubenswrapper[4666]: E1203 12:45:46.380245 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 12:45:46 crc kubenswrapper[4666]: E1203 12:45:46.380397 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvg65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-g4zjc_openstack(7ac4b252-1187-43c2-bfdd-0d48db4ff9e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:45:46 crc kubenswrapper[4666]: E1203 12:45:46.382900 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-g4zjc" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.597004 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2lngb"] Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.707856 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.709419 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.799285 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb\") pod \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.799733 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb\") pod \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.799927 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc\") pod \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.800559 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pw9n\" (UniqueName: \"kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n\") pod \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.800737 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config\") pod \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\" (UID: \"bd2bc7f2-09f4-46d1-8640-183260d1ccb8\") " Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.816239 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n" (OuterVolumeSpecName: "kube-api-access-2pw9n") pod "bd2bc7f2-09f4-46d1-8640-183260d1ccb8" (UID: "bd2bc7f2-09f4-46d1-8640-183260d1ccb8"). InnerVolumeSpecName "kube-api-access-2pw9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.902472 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pw9n\" (UniqueName: \"kubernetes.io/projected/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-kube-api-access-2pw9n\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.945499 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config" (OuterVolumeSpecName: "config") pod "bd2bc7f2-09f4-46d1-8640-183260d1ccb8" (UID: "bd2bc7f2-09f4-46d1-8640-183260d1ccb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.965505 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd2bc7f2-09f4-46d1-8640-183260d1ccb8" (UID: "bd2bc7f2-09f4-46d1-8640-183260d1ccb8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.977814 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd2bc7f2-09f4-46d1-8640-183260d1ccb8" (UID: "bd2bc7f2-09f4-46d1-8640-183260d1ccb8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.990499 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd2bc7f2-09f4-46d1-8640-183260d1ccb8" (UID: "bd2bc7f2-09f4-46d1-8640-183260d1ccb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.998282 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" event={"ID":"bd2bc7f2-09f4-46d1-8640-183260d1ccb8","Type":"ContainerDied","Data":"4195758383b42be222c809edc2b05500cea3ad929772eb4062e4675a467d41a4"} Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.998497 4666 scope.go:117] "RemoveContainer" containerID="c5c8364662d2aebcba8d5b3979f64fd1000e8c23732dbf7831ae8e92355f840a" Dec 03 12:45:46 crc kubenswrapper[4666]: I1203 12:45:46.998469 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-452xr" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.003800 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.003828 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.003839 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.003847 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2bc7f2-09f4-46d1-8640-183260d1ccb8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.008304 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2lngb" event={"ID":"4bd8e823-c02f-4841-a07a-a1e7fa64f75e","Type":"ContainerStarted","Data":"856e3ad0b8953140a1d3e750dcefb736fc93897429fa30b19feeb9bc025e5da1"} Dec 03 12:45:47 crc kubenswrapper[4666]: E1203 12:45:47.008918 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-g4zjc" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.031024 4666 scope.go:117] "RemoveContainer" containerID="d709e0ec0cae2052756127a4a3343cc8dac3fcdf3469c2dd5ddbdd52d4e0c4e9" Dec 03 12:45:47 crc 
kubenswrapper[4666]: I1203 12:45:47.061177 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.069641 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-452xr"] Dec 03 12:45:47 crc kubenswrapper[4666]: I1203 12:45:47.435857 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" path="/var/lib/kubelet/pods/bd2bc7f2-09f4-46d1-8640-183260d1ccb8/volumes" Dec 03 12:45:48 crc kubenswrapper[4666]: I1203 12:45:48.024339 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v65m9" event={"ID":"17f8d025-21f1-4e23-9e6a-75cf2202e447","Type":"ContainerStarted","Data":"640915585d72b35a44c9ff49461b6510dae0d6ec5dd2cbc7b9cdfe641a03b825"} Dec 03 12:45:48 crc kubenswrapper[4666]: I1203 12:45:48.034076 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerStarted","Data":"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582"} Dec 03 12:45:48 crc kubenswrapper[4666]: I1203 12:45:48.037542 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2lngb" event={"ID":"4bd8e823-c02f-4841-a07a-a1e7fa64f75e","Type":"ContainerStarted","Data":"3773289bcf1b2a4bdd8e616a70153c6c877a5cef9abe269603bed77f84330e0d"} Dec 03 12:45:48 crc kubenswrapper[4666]: I1203 12:45:48.049333 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v65m9" podStartSLOduration=4.341500802 podStartE2EDuration="29.049296646s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="2025-12-03 12:45:21.685581811 +0000 UTC m=+1910.530542862" lastFinishedPulling="2025-12-03 12:45:46.393377645 +0000 UTC m=+1935.238338706" observedRunningTime="2025-12-03 12:45:48.039484341 +0000 UTC m=+1936.884445392" watchObservedRunningTime="2025-12-03 12:45:48.049296646 +0000 UTC m=+1936.894257707" Dec 03 12:45:48 crc kubenswrapper[4666]: I1203 12:45:48.077371 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2lngb" podStartSLOduration=19.077345234 podStartE2EDuration="19.077345234s" podCreationTimestamp="2025-12-03 12:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:48.070787367 +0000 UTC m=+1936.915748418" watchObservedRunningTime="2025-12-03 12:45:48.077345234 +0000 UTC m=+1936.922306305" Dec 03 12:45:49 crc kubenswrapper[4666]: I1203 12:45:49.087191 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerStarted","Data":"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010"} Dec 03 12:45:50 crc kubenswrapper[4666]: I1203 12:45:50.097535 4666 generic.go:334] "Generic (PLEG): container finished" podID="17f8d025-21f1-4e23-9e6a-75cf2202e447" containerID="640915585d72b35a44c9ff49461b6510dae0d6ec5dd2cbc7b9cdfe641a03b825" exitCode=0 Dec 03 12:45:50 crc kubenswrapper[4666]: I1203 12:45:50.097611 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v65m9" event={"ID":"17f8d025-21f1-4e23-9e6a-75cf2202e447","Type":"ContainerDied","Data":"640915585d72b35a44c9ff49461b6510dae0d6ec5dd2cbc7b9cdfe641a03b825"} Dec 03 12:45:50 crc 
kubenswrapper[4666]: I1203 12:45:50.101418 4666 generic.go:334] "Generic (PLEG): container finished" podID="4bd8e823-c02f-4841-a07a-a1e7fa64f75e" containerID="3773289bcf1b2a4bdd8e616a70153c6c877a5cef9abe269603bed77f84330e0d" exitCode=0 Dec 03 12:45:50 crc kubenswrapper[4666]: I1203 12:45:50.101455 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2lngb" event={"ID":"4bd8e823-c02f-4841-a07a-a1e7fa64f75e","Type":"ContainerDied","Data":"3773289bcf1b2a4bdd8e616a70153c6c877a5cef9abe269603bed77f84330e0d"} Dec 03 12:45:51 crc kubenswrapper[4666]: I1203 12:45:51.440684 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:45:51 crc kubenswrapper[4666]: E1203 12:45:51.441148 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.842534 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.845454 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.917998 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918138 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918165 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918194 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data\") pod \"17f8d025-21f1-4e23-9e6a-75cf2202e447\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918260 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj7wv\" (UniqueName: \"kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918317 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts\") pod \"17f8d025-21f1-4e23-9e6a-75cf2202e447\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918337 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs\") pod \"17f8d025-21f1-4e23-9e6a-75cf2202e447\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918389 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtzx4\" (UniqueName: \"kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4\") pod \"17f8d025-21f1-4e23-9e6a-75cf2202e447\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918419 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle\") pod \"17f8d025-21f1-4e23-9e6a-75cf2202e447\" (UID: \"17f8d025-21f1-4e23-9e6a-75cf2202e447\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918453 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.918549 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle\") pod \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\" (UID: \"4bd8e823-c02f-4841-a07a-a1e7fa64f75e\") " Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.919393 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs" (OuterVolumeSpecName: "logs") pod "17f8d025-21f1-4e23-9e6a-75cf2202e447" (UID: "17f8d025-21f1-4e23-9e6a-75cf2202e447"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923210 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923248 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4" (OuterVolumeSpecName: "kube-api-access-jtzx4") pod "17f8d025-21f1-4e23-9e6a-75cf2202e447" (UID: "17f8d025-21f1-4e23-9e6a-75cf2202e447"). InnerVolumeSpecName "kube-api-access-jtzx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923263 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923317 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts" (OuterVolumeSpecName: "scripts") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923724 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts" (OuterVolumeSpecName: "scripts") pod "17f8d025-21f1-4e23-9e6a-75cf2202e447" (UID: "17f8d025-21f1-4e23-9e6a-75cf2202e447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.923779 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv" (OuterVolumeSpecName: "kube-api-access-zj7wv") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "kube-api-access-zj7wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.953466 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17f8d025-21f1-4e23-9e6a-75cf2202e447" (UID: "17f8d025-21f1-4e23-9e6a-75cf2202e447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.956918 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data" (OuterVolumeSpecName: "config-data") pod "17f8d025-21f1-4e23-9e6a-75cf2202e447" (UID: "17f8d025-21f1-4e23-9e6a-75cf2202e447"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.957634 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data" (OuterVolumeSpecName: "config-data") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:52 crc kubenswrapper[4666]: I1203 12:45:52.959313 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd8e823-c02f-4841-a07a-a1e7fa64f75e" (UID: "4bd8e823-c02f-4841-a07a-a1e7fa64f75e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021189 4666 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021236 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021254 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021272 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj7wv\" (UniqueName: \"kubernetes.io/projected/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-kube-api-access-zj7wv\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021291 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021307 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f8d025-21f1-4e23-9e6a-75cf2202e447-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021325 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtzx4\" (UniqueName: \"kubernetes.io/projected/17f8d025-21f1-4e23-9e6a-75cf2202e447-kube-api-access-jtzx4\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021343 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f8d025-21f1-4e23-9e6a-75cf2202e447-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021359 4666 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021374 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.021390 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd8e823-c02f-4841-a07a-a1e7fa64f75e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.127924 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerStarted","Data":"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0"} Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.130173 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2lngb" 
event={"ID":"4bd8e823-c02f-4841-a07a-a1e7fa64f75e","Type":"ContainerDied","Data":"856e3ad0b8953140a1d3e750dcefb736fc93897429fa30b19feeb9bc025e5da1"} Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.130197 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856e3ad0b8953140a1d3e750dcefb736fc93897429fa30b19feeb9bc025e5da1" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.130241 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2lngb" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.141920 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v65m9" event={"ID":"17f8d025-21f1-4e23-9e6a-75cf2202e447","Type":"ContainerDied","Data":"aa15b04462ffc18ebce3dcc352fb5394b04accbca09716c14f540f0fc35ef582"} Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.142051 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa15b04462ffc18ebce3dcc352fb5394b04accbca09716c14f540f0fc35ef582" Dec 03 12:45:53 crc kubenswrapper[4666]: I1203 12:45:53.142032 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v65m9" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.002634 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9676dbdb4-pcj6f"] Dec 03 12:45:54 crc kubenswrapper[4666]: E1203 12:45:54.004117 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f8d025-21f1-4e23-9e6a-75cf2202e447" containerName="placement-db-sync" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.004278 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f8d025-21f1-4e23-9e6a-75cf2202e447" containerName="placement-db-sync" Dec 03 12:45:54 crc kubenswrapper[4666]: E1203 12:45:54.004419 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.004537 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" Dec 03 12:45:54 crc kubenswrapper[4666]: E1203 12:45:54.004684 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="init" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.004800 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="init" Dec 03 12:45:54 crc kubenswrapper[4666]: E1203 12:45:54.004919 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd8e823-c02f-4841-a07a-a1e7fa64f75e" containerName="keystone-bootstrap" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.005030 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd8e823-c02f-4841-a07a-a1e7fa64f75e" containerName="keystone-bootstrap" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.005468 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f8d025-21f1-4e23-9e6a-75cf2202e447" containerName="placement-db-sync" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.005622 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2bc7f2-09f4-46d1-8640-183260d1ccb8" containerName="dnsmasq-dns" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.005756 4666 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4bd8e823-c02f-4841-a07a-a1e7fa64f75e" containerName="keystone-bootstrap" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.007073 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.010596 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9676dbdb4-pcj6f"] Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.011310 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.011567 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.024400 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.024860 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-26w9m" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.025256 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040324 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-combined-ca-bundle\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040416 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-config-data\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040448 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-internal-tls-certs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040477 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvkd\" (UniqueName: \"kubernetes.io/projected/08966743-608f-40d3-9a26-2515ef964f0f-kube-api-access-phvkd\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040508 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08966743-608f-40d3-9a26-2515ef964f0f-logs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040553 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-public-tls-certs\") 
pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.040588 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-scripts\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.073225 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fdb97596b-722zc"] Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.074498 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.077494 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.077806 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.077980 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.078264 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.079253 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l82fb" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.094403 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.097870 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdb97596b-722zc"] Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.141804 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-credential-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.141849 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-fernet-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.141886 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-config-data\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.141918 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-internal-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: 
\"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.141963 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-combined-ca-bundle\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142005 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-config-data\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142037 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-combined-ca-bundle\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142060 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-internal-tls-certs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142081 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phvkd\" (UniqueName: \"kubernetes.io/projected/08966743-608f-40d3-9a26-2515ef964f0f-kube-api-access-phvkd\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142129 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08966743-608f-40d3-9a26-2515ef964f0f-logs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142197 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnkx\" (UniqueName: \"kubernetes.io/projected/fad25ce8-9656-42a7-bc6a-369e68732b1e-kube-api-access-kvnkx\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142218 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-public-tls-certs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142235 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-scripts\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " 
pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142276 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-public-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.142295 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-scripts\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.143522 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08966743-608f-40d3-9a26-2515ef964f0f-logs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.146299 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-config-data\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.146555 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-combined-ca-bundle\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.148646 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-scripts\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.149618 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-internal-tls-certs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.149748 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08966743-608f-40d3-9a26-2515ef964f0f-public-tls-certs\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.158589 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvkd\" (UniqueName: \"kubernetes.io/projected/08966743-608f-40d3-9a26-2515ef964f0f-kube-api-access-phvkd\") pod \"placement-9676dbdb4-pcj6f\" (UID: \"08966743-608f-40d3-9a26-2515ef964f0f\") " pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243299 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-combined-ca-bundle\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243610 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnkx\" (UniqueName: \"kubernetes.io/projected/fad25ce8-9656-42a7-bc6a-369e68732b1e-kube-api-access-kvnkx\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243636 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-scripts\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243660 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-public-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243694 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-credential-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243715 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-fernet-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243732 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-config-data\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.243758 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-internal-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.247967 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-public-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.248011 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-combined-ca-bundle\") pod 
\"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.248465 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-fernet-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.249347 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-internal-tls-certs\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.250412 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-config-data\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.252615 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-scripts\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.255940 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fad25ce8-9656-42a7-bc6a-369e68732b1e-credential-keys\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.267766 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnkx\" (UniqueName: \"kubernetes.io/projected/fad25ce8-9656-42a7-bc6a-369e68732b1e-kube-api-access-kvnkx\") pod \"keystone-5fdb97596b-722zc\" (UID: \"fad25ce8-9656-42a7-bc6a-369e68732b1e\") " pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.329001 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.396794 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.771920 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9676dbdb4-pcj6f"] Dec 03 12:45:54 crc kubenswrapper[4666]: I1203 12:45:54.873002 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdb97596b-722zc"] Dec 03 12:45:54 crc kubenswrapper[4666]: W1203 12:45:54.882007 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfad25ce8_9656_42a7_bc6a_369e68732b1e.slice/crio-f370150c9ca83c2a59e2fa5118b731e8dd87383f8e5618406a620b85ce06918d WatchSource:0}: Error finding container f370150c9ca83c2a59e2fa5118b731e8dd87383f8e5618406a620b85ce06918d: Status 404 returned error can't find the container with id f370150c9ca83c2a59e2fa5118b731e8dd87383f8e5618406a620b85ce06918d Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.175320 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9676dbdb4-pcj6f" event={"ID":"08966743-608f-40d3-9a26-2515ef964f0f","Type":"ContainerStarted","Data":"92a5fb93b924bcdca404b2090a483e235f02b86ca0ec38d4b581cb765fc7029b"} Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.175646 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9676dbdb4-pcj6f" event={"ID":"08966743-608f-40d3-9a26-2515ef964f0f","Type":"ContainerStarted","Data":"92820cdbbcce6f31e6d74decc86bcb859fb75335c379d088f6406ab24e2a3603"} Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.178841 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb97596b-722zc" event={"ID":"fad25ce8-9656-42a7-bc6a-369e68732b1e","Type":"ContainerStarted","Data":"a911ef12ffab7abb2449fd906874abba1908ceb21ac950fece783268a9f422d8"} Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.178893 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb97596b-722zc" event={"ID":"fad25ce8-9656-42a7-bc6a-369e68732b1e","Type":"ContainerStarted","Data":"f370150c9ca83c2a59e2fa5118b731e8dd87383f8e5618406a620b85ce06918d"} Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.178931 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5fdb97596b-722zc" Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.190498 4666 generic.go:334] "Generic (PLEG): container finished" podID="3fc7cb74-0c27-4786-93fc-31c0e3a565b7" containerID="042a9505d607d6be8f5cae0985be852eb4b16770e8e2dce82be6d90c7dcc62ef" exitCode=0 Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.190569 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cnrgx" event={"ID":"3fc7cb74-0c27-4786-93fc-31c0e3a565b7","Type":"ContainerDied","Data":"042a9505d607d6be8f5cae0985be852eb4b16770e8e2dce82be6d90c7dcc62ef"} Dec 03 12:45:55 crc kubenswrapper[4666]: I1203 12:45:55.248079 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5fdb97596b-722zc" podStartSLOduration=1.248060344 podStartE2EDuration="1.248060344s" podCreationTimestamp="2025-12-03 12:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:55.237000515 +0000 UTC m=+1944.081961576" watchObservedRunningTime="2025-12-03 12:45:55.248060344 +0000 UTC m=+1944.093021395" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.201167 4666 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9676dbdb4-pcj6f" event={"ID":"08966743-608f-40d3-9a26-2515ef964f0f","Type":"ContainerStarted","Data":"17dd629f5d1e6959b7bf5996fc14455080af7ffc87aa17377cfe7434da8b26c8"} Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.201686 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.201710 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9676dbdb4-pcj6f" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.227025 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9676dbdb4-pcj6f" podStartSLOduration=3.227004927 podStartE2EDuration="3.227004927s" podCreationTimestamp="2025-12-03 12:45:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:45:56.224390037 +0000 UTC m=+1945.069351108" watchObservedRunningTime="2025-12-03 12:45:56.227004927 +0000 UTC m=+1945.071965998" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.510983 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.590454 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config\") pod \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.590713 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle\") pod \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.590739 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ggft\" (UniqueName: \"kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft\") pod \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\" (UID: \"3fc7cb74-0c27-4786-93fc-31c0e3a565b7\") " Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.595041 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft" (OuterVolumeSpecName: "kube-api-access-5ggft") pod "3fc7cb74-0c27-4786-93fc-31c0e3a565b7" (UID: "3fc7cb74-0c27-4786-93fc-31c0e3a565b7"). InnerVolumeSpecName "kube-api-access-5ggft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.612444 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config" (OuterVolumeSpecName: "config") pod "3fc7cb74-0c27-4786-93fc-31c0e3a565b7" (UID: "3fc7cb74-0c27-4786-93fc-31c0e3a565b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.636437 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fc7cb74-0c27-4786-93fc-31c0e3a565b7" (UID: "3fc7cb74-0c27-4786-93fc-31c0e3a565b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.694352 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.694394 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:56 crc kubenswrapper[4666]: I1203 12:45:56.694409 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ggft\" (UniqueName: \"kubernetes.io/projected/3fc7cb74-0c27-4786-93fc-31c0e3a565b7-kube-api-access-5ggft\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.209201 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cnrgx" event={"ID":"3fc7cb74-0c27-4786-93fc-31c0e3a565b7","Type":"ContainerDied","Data":"94a3b124e22c2b36536f1647f851a7add8e2b2e1bc9c0880551fcceb41231ed3"} Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.209438 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94a3b124e22c2b36536f1647f851a7add8e2b2e1bc9c0880551fcceb41231ed3" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.209215 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cnrgx" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.496809 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"] Dec 03 12:45:57 crc kubenswrapper[4666]: E1203 12:45:57.497275 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc7cb74-0c27-4786-93fc-31c0e3a565b7" containerName="neutron-db-sync" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.497293 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc7cb74-0c27-4786-93fc-31c0e3a565b7" containerName="neutron-db-sync" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.497446 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc7cb74-0c27-4786-93fc-31c0e3a565b7" containerName="neutron-db-sync" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.498439 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.507336 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"] Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.606736 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.606849 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.606889 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7t5f\" (UniqueName: \"kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.606971 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.607044 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.709730 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.709781 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7t5f\" (UniqueName: \"kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.709818 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.709865 4666 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.709902 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.711286 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.711906 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.712186 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.715107 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.729810 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7t5f\" (UniqueName: \"kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f\") pod \"dnsmasq-dns-7b946d459c-ztdnp\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.754404 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"] Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.755735 4666 util.go:30] "No sandbox for pod can be found. 
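
Note: the dnsmasq-dns-7b946d459c-ztdnp entries above show the volume manager's per-volume progression: VerifyControllerAttachedVolume (reconciler_common.go:245), then MountVolume (reconciler_common.go:218), then a MountVolume.SetUp success per volume (operation_generator.go:637). A compressed, illustrative sketch of that desired-state reconcile loop, with invented types (the real implementation is kubelet's volumemanager reconciler):

    package main

    import "fmt"

    type volume struct {
        name              string
        attached, mounted bool
    }

    // One reconcile pass: each volume advances one step per pass, which is why the
    // log shows a burst of "VerifyControllerAttachedVolume started" lines followed
    // by a burst of "MountVolume started" / "MountVolume.SetUp succeeded" lines.
    func reconcile(desired []*volume) {
        for _, v := range desired {
            switch {
            case !v.attached:
                fmt.Println("VerifyControllerAttachedVolume started for volume", v.name)
                v.attached = true
            case !v.mounted:
                fmt.Println("MountVolume started for volume", v.name)
                v.mounted = true
                fmt.Println("MountVolume.SetUp succeeded for volume", v.name)
            }
        }
    }

    func main() {
        vols := []*volume{{name: "config"}, {name: "dns-svc"}, {name: "kube-api-access-k7t5f"}}
        reconcile(vols) // verify pass
        reconcile(vols) // mount pass
    }
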
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.758138 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.758891 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.759165 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.760674 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cz97w"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.780672 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"]
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.811199 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.811294 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.811350 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wt9\" (UniqueName: \"kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.811511 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.811624 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.822202 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.912607 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wt9\" (UniqueName: \"kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.912684 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.912737 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.912795 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.912825 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.919235 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.919517 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.919629 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.924947 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:57 crc kubenswrapper[4666]: I1203 12:45:57.935491 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wt9\" (UniqueName: \"kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9\") pod \"neutron-5f668b5844-p5d4j\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") " pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:58 crc kubenswrapper[4666]: I1203 12:45:58.091080 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.844099 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-96d8bfbbf-pd9x2"]
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.847641 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.857740 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.858383 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.874567 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-96d8bfbbf-pd9x2"]
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.955577 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.955972 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-httpd-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.956002 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-ovndb-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.956029 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-combined-ca-bundle\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.956060 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb6qg\" (UniqueName: \"kubernetes.io/projected/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-kube-api-access-tb6qg\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.956076 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-public-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:45:59 crc kubenswrapper[4666]: I1203 12:45:59.956125 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-internal-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057587 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-combined-ca-bundle\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057647 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb6qg\" (UniqueName: \"kubernetes.io/projected/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-kube-api-access-tb6qg\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057673 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-public-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057717 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-internal-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057778 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057819 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-httpd-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.057843 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-ovndb-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.063788 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-ovndb-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.063853 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-internal-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.065481 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-public-tls-certs\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.065717 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-combined-ca-bundle\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.066755 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-httpd-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.073717 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-config\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.081903 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb6qg\" (UniqueName: \"kubernetes.io/projected/c6fc5a47-ba09-4985-b0d7-26d824dd60e3-kube-api-access-tb6qg\") pod \"neutron-96d8bfbbf-pd9x2\" (UID: \"c6fc5a47-ba09-4985-b0d7-26d824dd60e3\") " pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:00 crc kubenswrapper[4666]: I1203 12:46:00.192691 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:02 crc kubenswrapper[4666]: I1203 12:46:02.424215 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"
Dec 03 12:46:02 crc kubenswrapper[4666]: E1203 12:46:02.424871 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.091940 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"]
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.184454 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"]
Dec 03 12:46:04 crc kubenswrapper[4666]: W1203 12:46:04.202354 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f658bc5_7c9c_4534_8554_cb34af6b5a8b.slice/crio-819a0a5c0280ce68ee43e3704a738d8ef0a3e1750a6a7c2f1a47d90718da292b WatchSource:0}: Error finding container 819a0a5c0280ce68ee43e3704a738d8ef0a3e1750a6a7c2f1a47d90718da292b: Status 404 returned error can't find the container with id 819a0a5c0280ce68ee43e3704a738d8ef0a3e1750a6a7c2f1a47d90718da292b
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.271766 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-96d8bfbbf-pd9x2"]
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274080 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerStarted","Data":"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47"}
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274230 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-central-agent" containerID="cri-o://82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582" gracePeriod=30
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274260 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274389 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="proxy-httpd" containerID="cri-o://77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47" gracePeriod=30
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274424 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-notification-agent" containerID="cri-o://fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010" gracePeriod=30
Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.274518 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="sg-core" containerID="cri-o://dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" gracePeriod=30
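
Note: the pod_workers.go:1301 entry above shows machine-config-daemon held at the restart back-off ceiling ("back-off 5m0s"). The kubelet restarts crash-looping containers with an exponential back-off that doubles per failure up to a cap; the sketch below assumes the commonly cited defaults of 10s initial and 5m maximum, which match the message here but should be verified against the running kubelet's source and flags:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        backoff, max := 10*time.Second, 5*time.Minute // assumed defaults, not read from this node
        for i := 0; i < 8; i++ {
            fmt.Println(backoff) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s
            backoff *= 2
            if backoff > max {
                backoff = max
            }
        }
    }
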
containerID="cri-o://dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" gracePeriod=30 Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.278897 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g4zjc" event={"ID":"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8","Type":"ContainerStarted","Data":"2e75119f3f6d7de144f91ea4e85a9b5a8eacbacefc96ae2614ca9ca884f68951"} Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.284531 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerStarted","Data":"819a0a5c0280ce68ee43e3704a738d8ef0a3e1750a6a7c2f1a47d90718da292b"} Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.286478 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" event={"ID":"2cf51646-afaa-4852-bd35-323fe6ab6c4b","Type":"ContainerStarted","Data":"62442867cbf455bda98ab82f7a74efbaae09e08e294dc322c983a0833c970385"} Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.306221 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.277777737 podStartE2EDuration="45.306203088s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="2025-12-03 12:45:21.749556259 +0000 UTC m=+1910.594517310" lastFinishedPulling="2025-12-03 12:46:03.77798161 +0000 UTC m=+1952.622942661" observedRunningTime="2025-12-03 12:46:04.29850393 +0000 UTC m=+1953.143464981" watchObservedRunningTime="2025-12-03 12:46:04.306203088 +0000 UTC m=+1953.151164139" Dec 03 12:46:04 crc kubenswrapper[4666]: I1203 12:46:04.320308 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-g4zjc" podStartSLOduration=3.226060552 podStartE2EDuration="45.320290459s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="2025-12-03 12:45:21.684893773 +0000 UTC m=+1910.529854824" lastFinishedPulling="2025-12-03 12:46:03.77912368 +0000 UTC m=+1952.624084731" observedRunningTime="2025-12-03 12:46:04.314140323 +0000 UTC m=+1953.159101374" watchObservedRunningTime="2025-12-03 12:46:04.320290459 +0000 UTC m=+1953.165251510" Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.297221 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-96d8bfbbf-pd9x2" event={"ID":"c6fc5a47-ba09-4985-b0d7-26d824dd60e3","Type":"ContainerStarted","Data":"5fe351555ad7edab34130f52ee0c4656661f0e71ee53db9b71f9c8196daa06ea"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.297640 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-96d8bfbbf-pd9x2" Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.297651 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-96d8bfbbf-pd9x2" event={"ID":"c6fc5a47-ba09-4985-b0d7-26d824dd60e3","Type":"ContainerStarted","Data":"d46c20c13c7bc82ebaab671a12cd7ced0c6f923f6fece7e5f51600da53ef8799"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.297660 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-96d8bfbbf-pd9x2" event={"ID":"c6fc5a47-ba09-4985-b0d7-26d824dd60e3","Type":"ContainerStarted","Data":"adb417e81eff003ec284fbe15e3f620a8442c1ae6accfff8a01d99c113ee0059"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.299373 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" 
event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerStarted","Data":"520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.299423 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerStarted","Data":"73b6f33d0e7a54b88144cef186c00a72458817ce9202a1dc023b25d7fd4d7b7c"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.299546 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f668b5844-p5d4j" Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.300638 4666 generic.go:334] "Generic (PLEG): container finished" podID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerID="fb27b7a6784770705cee843d2964df0f8128dd8f29fd7c520e665abf4eb66912" exitCode=0 Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.300718 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" event={"ID":"2cf51646-afaa-4852-bd35-323fe6ab6c4b","Type":"ContainerDied","Data":"fb27b7a6784770705cee843d2964df0f8128dd8f29fd7c520e665abf4eb66912"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.301972 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrrxv" event={"ID":"9aac4d80-7d0d-4037-a398-6a28ab35d1c9","Type":"ContainerStarted","Data":"cfe8952f4198519cf95562a9c321f3f7f6e2771b7f5feea2e11a1b7409bfd91b"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304487 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerID="77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47" exitCode=0 Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304518 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerID="dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" exitCode=2 Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304529 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerID="82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582" exitCode=0 Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304551 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerDied","Data":"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304573 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerDied","Data":"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.304586 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerDied","Data":"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582"} Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.320407 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-96d8bfbbf-pd9x2" podStartSLOduration=6.320390094 podStartE2EDuration="6.320390094s" podCreationTimestamp="2025-12-03 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:05.31651048 +0000 UTC m=+1954.161471531" watchObservedRunningTime="2025-12-03 12:46:05.320390094 +0000 UTC m=+1954.165351145" Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.345059 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vrrxv" podStartSLOduration=4.308154421 podStartE2EDuration="46.34503366s" podCreationTimestamp="2025-12-03 12:45:19 +0000 UTC" firstStartedPulling="2025-12-03 12:45:21.745037367 +0000 UTC m=+1910.589998418" lastFinishedPulling="2025-12-03 12:46:03.781916606 +0000 UTC m=+1952.626877657" observedRunningTime="2025-12-03 12:46:05.338426732 +0000 UTC m=+1954.183387803" watchObservedRunningTime="2025-12-03 12:46:05.34503366 +0000 UTC m=+1954.189994711" Dec 03 12:46:05 crc kubenswrapper[4666]: I1203 12:46:05.366485 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f668b5844-p5d4j" podStartSLOduration=8.366461909 podStartE2EDuration="8.366461909s" podCreationTimestamp="2025-12-03 12:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:05.361498115 +0000 UTC m=+1954.206459196" watchObservedRunningTime="2025-12-03 12:46:05.366461909 +0000 UTC m=+1954.211422980" Dec 03 12:46:06 crc kubenswrapper[4666]: I1203 12:46:06.324501 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" event={"ID":"2cf51646-afaa-4852-bd35-323fe6ab6c4b","Type":"ContainerStarted","Data":"72e5d7f596e2334fe1abb6425e30b4f7b3862a060c00e6a74bf1f13854435e61"} Dec 03 12:46:06 crc kubenswrapper[4666]: I1203 12:46:06.324789 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:46:06 crc kubenswrapper[4666]: I1203 12:46:06.326806 4666 generic.go:334] "Generic (PLEG): container finished" podID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" containerID="2e75119f3f6d7de144f91ea4e85a9b5a8eacbacefc96ae2614ca9ca884f68951" exitCode=0 Dec 03 12:46:06 crc kubenswrapper[4666]: I1203 12:46:06.326930 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g4zjc" event={"ID":"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8","Type":"ContainerDied","Data":"2e75119f3f6d7de144f91ea4e85a9b5a8eacbacefc96ae2614ca9ca884f68951"} Dec 03 12:46:06 crc kubenswrapper[4666]: I1203 12:46:06.347373 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" podStartSLOduration=9.347356586 podStartE2EDuration="9.347356586s" podCreationTimestamp="2025-12-03 12:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:06.34529399 +0000 UTC m=+1955.190255041" watchObservedRunningTime="2025-12-03 12:46:06.347356586 +0000 UTC m=+1955.192317637" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.711164 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.815641 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle\") pod \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.815765 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data\") pod \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.815813 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65\") pod \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\" (UID: \"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8\") " Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.826010 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65" (OuterVolumeSpecName: "kube-api-access-gvg65") pod "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" (UID: "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8"). InnerVolumeSpecName "kube-api-access-gvg65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.826354 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" (UID: "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.842341 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" (UID: "7ac4b252-1187-43c2-bfdd-0d48db4ff9e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.917594 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-kube-api-access-gvg65\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.917793 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:07 crc kubenswrapper[4666]: I1203 12:46:07.917868 4666 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.310154 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.321743 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dt9m\" (UniqueName: \"kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.321892 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.321922 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.322003 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.322670 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.322686 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.322807 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.322831 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts\") pod \"9ea43de1-1aef-4b87-a8e4-793ea2029687\" (UID: \"9ea43de1-1aef-4b87-a8e4-793ea2029687\") " Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.323258 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.323539 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.326522 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts" (OuterVolumeSpecName: "scripts") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.326956 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m" (OuterVolumeSpecName: "kube-api-access-7dt9m") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "kube-api-access-7dt9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.344124 4666 generic.go:334] "Generic (PLEG): container finished" podID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerID="fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010" exitCode=0 Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.344175 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerDied","Data":"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010"} Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.344224 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ea43de1-1aef-4b87-a8e4-793ea2029687","Type":"ContainerDied","Data":"145df2e61fc81d7a260a88aa4cb98d17fddd073cf9d5d319a194585427425165"} Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.344240 4666 scope.go:117] "RemoveContainer" containerID="77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.344372 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.349124 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g4zjc" event={"ID":"7ac4b252-1187-43c2-bfdd-0d48db4ff9e8","Type":"ContainerDied","Data":"26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd"} Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.349549 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26df360361374ad95012c14c99520dcf3f7325dc5f5d61ce6441b15642fb07fd" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.349668 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g4zjc" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.363983 4666 scope.go:117] "RemoveContainer" containerID="dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.368215 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.385670 4666 scope.go:117] "RemoveContainer" containerID="fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.401554 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.410862 4666 scope.go:117] "RemoveContainer" containerID="82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423243 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data" (OuterVolumeSpecName: "config-data") pod "9ea43de1-1aef-4b87-a8e4-793ea2029687" (UID: "9ea43de1-1aef-4b87-a8e4-793ea2029687"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423916 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423946 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423958 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423972 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dt9m\" (UniqueName: \"kubernetes.io/projected/9ea43de1-1aef-4b87-a8e4-793ea2029687-kube-api-access-7dt9m\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423985 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea43de1-1aef-4b87-a8e4-793ea2029687-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.423993 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ea43de1-1aef-4b87-a8e4-793ea2029687-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.427410 4666 scope.go:117] "RemoveContainer" containerID="77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47" Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.427813 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47\": container with ID starting with 77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47 not found: ID does not exist" containerID="77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47" Dec 03 12:46:08 crc 
kubenswrapper[4666]: I1203 12:46:08.427852 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47"} err="failed to get container status \"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47\": rpc error: code = NotFound desc = could not find container \"77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47\": container with ID starting with 77a84eea99fac4c571cdbf41071fda1c7a32eebbb3f264a3f26a67002cb68f47 not found: ID does not exist" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.427882 4666 scope.go:117] "RemoveContainer" containerID="dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.428199 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0\": container with ID starting with dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0 not found: ID does not exist" containerID="dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.428263 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0"} err="failed to get container status \"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0\": rpc error: code = NotFound desc = could not find container \"dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0\": container with ID starting with dc356b35d302d21fb64551523b7e5bacceca3d9e1b8f8d64f624dbfea4922fe0 not found: ID does not exist" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.428290 4666 scope.go:117] "RemoveContainer" containerID="fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010" Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.428638 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010\": container with ID starting with fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010 not found: ID does not exist" containerID="fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.428665 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010"} err="failed to get container status \"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010\": rpc error: code = NotFound desc = could not find container \"fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010\": container with ID starting with fd1fddae80f8826987840d93a7fb0976298a93d25980760bb086abfd68ed1010 not found: ID does not exist" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.428682 4666 scope.go:117] "RemoveContainer" containerID="82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582" Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.428899 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582\": container with ID starting with 
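
Note: the RemoveContainer / NotFound pairs above are a benign race: the kubelet asks the runtime to delete containers of the torn-down ceilometer-0 pod, re-queries their status, and CRI-O has already forgotten the IDs, so each lookup returns rpc code = NotFound and pod_container_deletor logs the error and moves on. The usual client-side pattern is to treat NotFound as success on delete; an illustrative sketch using the gRPC status codes the err strings above are rendered from (requires the google.golang.org/grpc module):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // Illustrative: deleting something that is already gone counts as success.
    func removeContainer(id string, remove func(string) error) error {
        if err := remove(id); err != nil {
            if status.Code(err) == codes.NotFound {
                return nil // already removed; mirrors the benign NotFound entries above
            }
            return err
        }
        return nil
    }

    func main() {
        gone := func(id string) error {
            return status.Error(codes.NotFound, "could not find container "+id)
        }
        fmt.Println(removeContainer("77a84eea", gone)) // <nil>
    }
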
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.428937 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582"} err="failed to get container status \"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582\": rpc error: code = NotFound desc = could not find container \"82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582\": container with ID starting with 82e1f20eab3b330ccbdb1882b8897b539861cea36cc370545a7eb98bcaa92582 not found: ID does not exist"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676175 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64566d6d86-kczk4"]
Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.676528 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-central-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676547 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-central-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.676567 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="sg-core"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676574 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="sg-core"
Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.676590 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" containerName="barbican-db-sync"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676595 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" containerName="barbican-db-sync"
Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.676609 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="proxy-httpd"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676616 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="proxy-httpd"
Dec 03 12:46:08 crc kubenswrapper[4666]: E1203 12:46:08.676627 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-notification-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676634 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-notification-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676796 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-notification-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676812 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="ceilometer-central-agent"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676823 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="sg-core"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676835 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" containerName="barbican-db-sync"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.676843 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" containerName="proxy-httpd"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.677653 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.681041 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.681230 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.681627 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lmncm"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.694720 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64566d6d86-kczk4"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.708063 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f7f466d4c-4ps5s"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.711584 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f7f466d4c-4ps5s"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.716011 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.717482 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.744213 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.750013 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f7f466d4c-4ps5s"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.810540 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.812459 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.819584 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.819835 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.832043 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835703 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6c8k\" (UniqueName: \"kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835737 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0043f11-1613-418f-9974-e88038dd7e5e-logs\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835761 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835780 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-combined-ca-bundle\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835796 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835814 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835833 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dde289d-753b-4a00-8863-b671281a0bef-logs\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835852 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-combined-ca-bundle\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835882 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835899 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835935 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835953 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg56s\" (UniqueName: \"kubernetes.io/projected/7dde289d-753b-4a00-8863-b671281a0bef-kube-api-access-kg56s\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.835986 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.836006 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data-custom\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.836042 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkg8j\" (UniqueName: \"kubernetes.io/projected/b0043f11-1613-418f-9974-e88038dd7e5e-kube-api-access-wkg8j\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.836072 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data-custom\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.836118 4666 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.895062 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"] Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.895388 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="dnsmasq-dns" containerID="cri-o://72e5d7f596e2334fe1abb6425e30b4f7b3862a060c00e6a74bf1f13854435e61" gracePeriod=10 Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.939593 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.946886 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.946983 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.946271 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953381 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg56s\" (UniqueName: \"kubernetes.io/projected/7dde289d-753b-4a00-8863-b671281a0bef-kube-api-access-kg56s\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953590 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953636 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data-custom\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: 
I1203 12:46:08.953702 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkg8j\" (UniqueName: \"kubernetes.io/projected/b0043f11-1613-418f-9974-e88038dd7e5e-kube-api-access-wkg8j\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953772 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data-custom\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953839 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953927 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6c8k\" (UniqueName: \"kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.953969 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0043f11-1613-418f-9974-e88038dd7e5e-logs\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954017 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954062 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-combined-ca-bundle\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954114 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954151 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954176 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dde289d-753b-4a00-8863-b671281a0bef-logs\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.954209 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-combined-ca-bundle\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.955169 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.957214 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.957537 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0043f11-1613-418f-9974-e88038dd7e5e-logs\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.959267 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dde289d-753b-4a00-8863-b671281a0bef-logs\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.959707 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.961973 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.962264 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data-custom\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.963533 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.964457 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-combined-ca-bundle\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.968573 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.973061 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-config-data-custom\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.976089 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0043f11-1613-418f-9974-e88038dd7e5e-config-data\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.976296 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg56s\" (UniqueName: \"kubernetes.io/projected/7dde289d-753b-4a00-8863-b671281a0bef-kube-api-access-kg56s\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.978078 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dde289d-753b-4a00-8863-b671281a0bef-combined-ca-bundle\") pod \"barbican-worker-5f7f466d4c-4ps5s\" (UID: \"7dde289d-753b-4a00-8863-b671281a0bef\") " pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.980310 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.981759 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.983647 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6c8k\" (UniqueName: \"kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.983664 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data\") pod \"ceilometer-0\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " pod="openstack/ceilometer-0" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.989087 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkg8j\" (UniqueName: \"kubernetes.io/projected/b0043f11-1613-418f-9974-e88038dd7e5e-kube-api-access-wkg8j\") pod \"barbican-keystone-listener-64566d6d86-kczk4\" (UID: \"b0043f11-1613-418f-9974-e88038dd7e5e\") " pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:08 crc kubenswrapper[4666]: I1203 12:46:08.998476 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"] Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.000138 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.003550 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.009683 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.010562 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"] Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.040505 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f7f466d4c-4ps5s" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057554 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057624 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057758 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057820 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057891 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.057963 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9nmm\" (UniqueName: \"kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.058015 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.058060 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqr5k\" (UniqueName: \"kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.058177 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.058297 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.151994 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165439 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165509 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165545 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165564 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9nmm\" (UniqueName: \"kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165582 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165598 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqr5k\" (UniqueName: \"kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165657 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165690 4666 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165749 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.165770 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.166659 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.167110 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.167192 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.168030 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.168405 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.175900 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.176781 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.186009 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.193256 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqr5k\" (UniqueName: \"kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k\") pod \"dnsmasq-dns-6bb684768f-vw4kp\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") " pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.196126 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9nmm\" (UniqueName: \"kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm\") pod \"barbican-api-845b8c48c4-mw2nj\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") " pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.372510 4666 generic.go:334] "Generic (PLEG): container finished" podID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerID="72e5d7f596e2334fe1abb6425e30b4f7b3862a060c00e6a74bf1f13854435e61" exitCode=0 Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.372579 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" event={"ID":"2cf51646-afaa-4852-bd35-323fe6ab6c4b","Type":"ContainerDied","Data":"72e5d7f596e2334fe1abb6425e30b4f7b3862a060c00e6a74bf1f13854435e61"} Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.437392 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea43de1-1aef-4b87-a8e4-793ea2029687" path="/var/lib/kubelet/pods/9ea43de1-1aef-4b87-a8e4-793ea2029687/volumes" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.459691 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:09 crc kubenswrapper[4666]: W1203 12:46:09.463519 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0043f11_1613_418f_9974_e88038dd7e5e.slice/crio-730648c8c6d492ce019d1b60fdca351a845637e0b41796185a28f2ae5f5de7cd WatchSource:0}: Error finding container 730648c8c6d492ce019d1b60fdca351a845637e0b41796185a28f2ae5f5de7cd: Status 404 returned error can't find the container with id 730648c8c6d492ce019d1b60fdca351a845637e0b41796185a28f2ae5f5de7cd Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.463858 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64566d6d86-kczk4"] Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.470537 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.564273 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f7f466d4c-4ps5s"] Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.703588 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:09 crc kubenswrapper[4666]: W1203 12:46:09.705740 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cff28e2_a79e_4a5f_8adc_a72a3e03f8ac.slice/crio-8fa1fe75d6deab6ab597248e5602663c568f493bed0c13c2dc4c158b2c86ac3c WatchSource:0}: Error finding container 8fa1fe75d6deab6ab597248e5602663c568f493bed0c13c2dc4c158b2c86ac3c: Status 404 returned error can't find the container with id 8fa1fe75d6deab6ab597248e5602663c568f493bed0c13c2dc4c158b2c86ac3c Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.933756 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:09 crc kubenswrapper[4666]: W1203 12:46:09.937062 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78ed53b7_3b15_46aa_a024_8f1b69d62469.slice/crio-a660b7757ea5002394dbc94ed0286456ef847cb2c941aa93a71d31a5193ad98b WatchSource:0}: Error finding container a660b7757ea5002394dbc94ed0286456ef847cb2c941aa93a71d31a5193ad98b: Status 404 returned error can't find the container with id a660b7757ea5002394dbc94ed0286456ef847cb2c941aa93a71d31a5193ad98b Dec 03 12:46:09 crc kubenswrapper[4666]: I1203 12:46:09.986821 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"] Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.178997 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.284839 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb\") pod \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.284893 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7t5f\" (UniqueName: \"kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f\") pod \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.284947 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb\") pod \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.284972 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config\") pod \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.284998 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc\") pod \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\" (UID: \"2cf51646-afaa-4852-bd35-323fe6ab6c4b\") " Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.291364 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f" (OuterVolumeSpecName: "kube-api-access-k7t5f") pod "2cf51646-afaa-4852-bd35-323fe6ab6c4b" (UID: "2cf51646-afaa-4852-bd35-323fe6ab6c4b"). InnerVolumeSpecName "kube-api-access-k7t5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.383160 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerStarted","Data":"c35483ead349fa2e891c1e6e276cae2be8079e3e1d56d8bc57d1a5fa5e346e17"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.383474 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerStarted","Data":"a660b7757ea5002394dbc94ed0286456ef847cb2c941aa93a71d31a5193ad98b"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.384761 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerStarted","Data":"8fa1fe75d6deab6ab597248e5602663c568f493bed0c13c2dc4c158b2c86ac3c"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.386747 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7t5f\" (UniqueName: \"kubernetes.io/projected/2cf51646-afaa-4852-bd35-323fe6ab6c4b-kube-api-access-k7t5f\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.389561 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" event={"ID":"b0043f11-1613-418f-9974-e88038dd7e5e","Type":"ContainerStarted","Data":"730648c8c6d492ce019d1b60fdca351a845637e0b41796185a28f2ae5f5de7cd"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.390939 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f7f466d4c-4ps5s" event={"ID":"7dde289d-753b-4a00-8863-b671281a0bef","Type":"ContainerStarted","Data":"fcad28bdd56ff060c6efef93d2096f127ddf9bc29694ae7f1e1baa51d1547d89"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.393038 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2cf51646-afaa-4852-bd35-323fe6ab6c4b" (UID: "2cf51646-afaa-4852-bd35-323fe6ab6c4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.393800 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.393796 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-ztdnp" event={"ID":"2cf51646-afaa-4852-bd35-323fe6ab6c4b","Type":"ContainerDied","Data":"62442867cbf455bda98ab82f7a74efbaae09e08e294dc322c983a0833c970385"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.393864 4666 scope.go:117] "RemoveContainer" containerID="72e5d7f596e2334fe1abb6425e30b4f7b3862a060c00e6a74bf1f13854435e61" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.395957 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerStarted","Data":"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.395982 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerStarted","Data":"b62ebdb11ab717354a41edeff089dcf2f34e125a43357293e381a2ada5b9860c"} Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.397811 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2cf51646-afaa-4852-bd35-323fe6ab6c4b" (UID: "2cf51646-afaa-4852-bd35-323fe6ab6c4b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.400386 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2cf51646-afaa-4852-bd35-323fe6ab6c4b" (UID: "2cf51646-afaa-4852-bd35-323fe6ab6c4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.403699 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config" (OuterVolumeSpecName: "config") pod "2cf51646-afaa-4852-bd35-323fe6ab6c4b" (UID: "2cf51646-afaa-4852-bd35-323fe6ab6c4b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.465363 4666 scope.go:117] "RemoveContainer" containerID="fb27b7a6784770705cee843d2964df0f8128dd8f29fd7c520e665abf4eb66912" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.488955 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.489000 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.489015 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.489024 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cf51646-afaa-4852-bd35-323fe6ab6c4b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.726531 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"] Dec 03 12:46:10 crc kubenswrapper[4666]: I1203 12:46:10.734569 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-ztdnp"] Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.407280 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerStarted","Data":"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"} Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.407544 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.410919 4666 generic.go:334] "Generic (PLEG): container finished" podID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerID="c35483ead349fa2e891c1e6e276cae2be8079e3e1d56d8bc57d1a5fa5e346e17" exitCode=0 Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.410968 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerDied","Data":"c35483ead349fa2e891c1e6e276cae2be8079e3e1d56d8bc57d1a5fa5e346e17"} Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.431887 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-845b8c48c4-mw2nj" podStartSLOduration=3.431867572 podStartE2EDuration="3.431867572s" podCreationTimestamp="2025-12-03 12:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:11.425227972 +0000 UTC m=+1960.270189043" watchObservedRunningTime="2025-12-03 12:46:11.431867572 +0000 UTC m=+1960.276828623" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.442447 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" path="/var/lib/kubelet/pods/2cf51646-afaa-4852-bd35-323fe6ab6c4b/volumes" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.892999 
4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d6d7b74fd-t9f48"] Dec 03 12:46:11 crc kubenswrapper[4666]: E1203 12:46:11.893423 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="dnsmasq-dns" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.893440 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="dnsmasq-dns" Dec 03 12:46:11 crc kubenswrapper[4666]: E1203 12:46:11.893455 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="init" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.893463 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="init" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.893684 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf51646-afaa-4852-bd35-323fe6ab6c4b" containerName="dnsmasq-dns" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.894760 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.897318 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.908855 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912436 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4t4c\" (UniqueName: \"kubernetes.io/projected/e1a5016b-ec1e-485e-bedf-ced8377f2aae-kube-api-access-h4t4c\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912532 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data-custom\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912628 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-internal-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912682 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-public-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912738 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data\") pod 
\"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912770 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1a5016b-ec1e-485e-bedf-ced8377f2aae-logs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.912810 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-combined-ca-bundle\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:11 crc kubenswrapper[4666]: I1203 12:46:11.917966 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d6d7b74fd-t9f48"] Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.014582 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-public-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.014953 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.014997 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1a5016b-ec1e-485e-bedf-ced8377f2aae-logs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.015035 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-combined-ca-bundle\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.015069 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4t4c\" (UniqueName: \"kubernetes.io/projected/e1a5016b-ec1e-485e-bedf-ced8377f2aae-kube-api-access-h4t4c\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.015152 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data-custom\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.015238 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-internal-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.015804 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1a5016b-ec1e-485e-bedf-ced8377f2aae-logs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.019231 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-public-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.019747 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-internal-tls-certs\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.020115 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data-custom\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.020939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-config-data\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.022167 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a5016b-ec1e-485e-bedf-ced8377f2aae-combined-ca-bundle\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.039178 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4t4c\" (UniqueName: \"kubernetes.io/projected/e1a5016b-ec1e-485e-bedf-ced8377f2aae-kube-api-access-h4t4c\") pod \"barbican-api-7d6d7b74fd-t9f48\" (UID: \"e1a5016b-ec1e-485e-bedf-ced8377f2aae\") " pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.226310 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.427241 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerStarted","Data":"4b67c7f22d7011471b83ac6b8adee6d1f1498093b2b80b638cdb7921d1495935"} Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.427314 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.427377 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:12 crc kubenswrapper[4666]: I1203 12:46:12.450795 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" podStartSLOduration=4.450775055 podStartE2EDuration="4.450775055s" podCreationTimestamp="2025-12-03 12:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:12.447781004 +0000 UTC m=+1961.292742075" watchObservedRunningTime="2025-12-03 12:46:12.450775055 +0000 UTC m=+1961.295736106" Dec 03 12:46:14 crc kubenswrapper[4666]: I1203 12:46:14.424114 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:46:14 crc kubenswrapper[4666]: E1203 12:46:14.424897 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:46:14 crc kubenswrapper[4666]: W1203 12:46:14.580422 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a5016b_ec1e_485e_bedf_ced8377f2aae.slice/crio-0066fabec7f6fa5965c4f8b2702f03f34c49dc561c5492f5e3ff0f5ad06a2871 WatchSource:0}: Error finding container 0066fabec7f6fa5965c4f8b2702f03f34c49dc561c5492f5e3ff0f5ad06a2871: Status 404 returned error can't find the container with id 0066fabec7f6fa5965c4f8b2702f03f34c49dc561c5492f5e3ff0f5ad06a2871 Dec 03 12:46:14 crc kubenswrapper[4666]: I1203 12:46:14.585061 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d6d7b74fd-t9f48"] Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.458184 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" event={"ID":"b0043f11-1613-418f-9974-e88038dd7e5e","Type":"ContainerStarted","Data":"5da027c8bdde161c4c1509f2ba4be4b2976840ffff9374e3017bbcff372a2076"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.458499 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" event={"ID":"b0043f11-1613-418f-9974-e88038dd7e5e","Type":"ContainerStarted","Data":"4aa8179d81bb28b3bbade37a2b47f1553bc138dd67d734549d9a8dadbc13cf44"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.460589 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d6d7b74fd-t9f48" 
event={"ID":"e1a5016b-ec1e-485e-bedf-ced8377f2aae","Type":"ContainerStarted","Data":"6bedfcb29be9d4eb7cc22a7198222c34d117d23e47af0e6fff561a5393386abd"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.460621 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d6d7b74fd-t9f48" event={"ID":"e1a5016b-ec1e-485e-bedf-ced8377f2aae","Type":"ContainerStarted","Data":"b453a036e6708768916f9a5290188669d2533ddd7b42b6250e918d666e3ab08d"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.460632 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d6d7b74fd-t9f48" event={"ID":"e1a5016b-ec1e-485e-bedf-ced8377f2aae","Type":"ContainerStarted","Data":"0066fabec7f6fa5965c4f8b2702f03f34c49dc561c5492f5e3ff0f5ad06a2871"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.461278 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.461306 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.463312 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f7f466d4c-4ps5s" event={"ID":"7dde289d-753b-4a00-8863-b671281a0bef","Type":"ContainerStarted","Data":"0a1997f5d04072b62e980da65f19bfd26d957b545812b448a16c4fe1a4b5e80a"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.463350 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f7f466d4c-4ps5s" event={"ID":"7dde289d-753b-4a00-8863-b671281a0bef","Type":"ContainerStarted","Data":"0bc64dc8cd49015ebe37c3a2cc72c9ee9cf2c5da748ed09c02e17c54a6f8b1b6"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.467851 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerStarted","Data":"51390f023f6a5e2de380856a2fa0f240602ed0a97d64502e92b41bf31a13357d"} Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.482517 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64566d6d86-kczk4" podStartSLOduration=2.752508701 podStartE2EDuration="7.48249335s" podCreationTimestamp="2025-12-03 12:46:08 +0000 UTC" firstStartedPulling="2025-12-03 12:46:09.472610297 +0000 UTC m=+1958.317571348" lastFinishedPulling="2025-12-03 12:46:14.202594946 +0000 UTC m=+1963.047555997" observedRunningTime="2025-12-03 12:46:15.475498571 +0000 UTC m=+1964.320459622" watchObservedRunningTime="2025-12-03 12:46:15.48249335 +0000 UTC m=+1964.327454401" Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.526943 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f7f466d4c-4ps5s" podStartSLOduration=2.907798846 podStartE2EDuration="7.52692174s" podCreationTimestamp="2025-12-03 12:46:08 +0000 UTC" firstStartedPulling="2025-12-03 12:46:09.582955258 +0000 UTC m=+1958.427916309" lastFinishedPulling="2025-12-03 12:46:14.202078152 +0000 UTC m=+1963.047039203" observedRunningTime="2025-12-03 12:46:15.502833929 +0000 UTC m=+1964.347795000" watchObservedRunningTime="2025-12-03 12:46:15.52692174 +0000 UTC m=+1964.371882791" Dec 03 12:46:15 crc kubenswrapper[4666]: I1203 12:46:15.528021 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d6d7b74fd-t9f48" 
podStartSLOduration=4.528010979 podStartE2EDuration="4.528010979s" podCreationTimestamp="2025-12-03 12:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:15.522428929 +0000 UTC m=+1964.367389980" watchObservedRunningTime="2025-12-03 12:46:15.528010979 +0000 UTC m=+1964.372972040" Dec 03 12:46:17 crc kubenswrapper[4666]: I1203 12:46:17.490655 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerStarted","Data":"ca8ebbbf5482d7074e56ea0dcbb4d06bc5bd74e098ee400d3c2582bb272b8bee"} Dec 03 12:46:18 crc kubenswrapper[4666]: I1203 12:46:18.002487 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-845b8c48c4-mw2nj" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:46:19 crc kubenswrapper[4666]: I1203 12:46:19.462418 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" Dec 03 12:46:19 crc kubenswrapper[4666]: I1203 12:46:19.560349 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:46:19 crc kubenswrapper[4666]: I1203 12:46:19.560621 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="dnsmasq-dns" containerID="cri-o://7aa4033ec778a65011d4cf5e80f67cca6cdc2f9afab2ef8784445ba07595c1e8" gracePeriod=10 Dec 03 12:46:20 crc kubenswrapper[4666]: I1203 12:46:20.077333 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 12:46:20 crc kubenswrapper[4666]: I1203 12:46:20.527944 4666 generic.go:334] "Generic (PLEG): container finished" podID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerID="7aa4033ec778a65011d4cf5e80f67cca6cdc2f9afab2ef8784445ba07595c1e8" exitCode=0 Dec 03 12:46:20 crc kubenswrapper[4666]: I1203 12:46:20.528159 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" event={"ID":"0752436b-39dd-4cae-87a4-1b51901ad71f","Type":"ContainerDied","Data":"7aa4033ec778a65011d4cf5e80f67cca6cdc2f9afab2ef8784445ba07595c1e8"} Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.059533 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.298670 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.340301 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-845b8c48c4-mw2nj" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.392066 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q642v\" (UniqueName: \"kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v\") pod \"0752436b-39dd-4cae-87a4-1b51901ad71f\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.392147 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config\") pod \"0752436b-39dd-4cae-87a4-1b51901ad71f\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.392216 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb\") pod \"0752436b-39dd-4cae-87a4-1b51901ad71f\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.392233 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb\") pod \"0752436b-39dd-4cae-87a4-1b51901ad71f\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.392304 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc\") pod \"0752436b-39dd-4cae-87a4-1b51901ad71f\" (UID: \"0752436b-39dd-4cae-87a4-1b51901ad71f\") " Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.402359 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v" (OuterVolumeSpecName: "kube-api-access-q642v") pod "0752436b-39dd-4cae-87a4-1b51901ad71f" (UID: "0752436b-39dd-4cae-87a4-1b51901ad71f"). InnerVolumeSpecName "kube-api-access-q642v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.488631 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0752436b-39dd-4cae-87a4-1b51901ad71f" (UID: "0752436b-39dd-4cae-87a4-1b51901ad71f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.488846 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0752436b-39dd-4cae-87a4-1b51901ad71f" (UID: "0752436b-39dd-4cae-87a4-1b51901ad71f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.503414 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q642v\" (UniqueName: \"kubernetes.io/projected/0752436b-39dd-4cae-87a4-1b51901ad71f-kube-api-access-q642v\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.503463 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.503476 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.525512 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config" (OuterVolumeSpecName: "config") pod "0752436b-39dd-4cae-87a4-1b51901ad71f" (UID: "0752436b-39dd-4cae-87a4-1b51901ad71f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.527554 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0752436b-39dd-4cae-87a4-1b51901ad71f" (UID: "0752436b-39dd-4cae-87a4-1b51901ad71f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.551328 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerStarted","Data":"ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80"} Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.552681 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" event={"ID":"0752436b-39dd-4cae-87a4-1b51901ad71f","Type":"ContainerDied","Data":"9b7c5f238870acef15e990e94dcc7e143a2827531470c3179a1f5f22c3514eb0"} Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.552757 4666 scope.go:117] "RemoveContainer" containerID="7aa4033ec778a65011d4cf5e80f67cca6cdc2f9afab2ef8784445ba07595c1e8" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.552710 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-pp2z9" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.578914 4666 scope.go:117] "RemoveContainer" containerID="4568f25f6866e87575936ec8181dc91d542f257b2853ef2674c5147713585ebd" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.588371 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.595798 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-pp2z9"] Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.604661 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:21 crc kubenswrapper[4666]: I1203 12:46:21.604691 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0752436b-39dd-4cae-87a4-1b51901ad71f-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:22 crc kubenswrapper[4666]: I1203 12:46:22.563848 4666 generic.go:334] "Generic (PLEG): container finished" podID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" containerID="cfe8952f4198519cf95562a9c321f3f7f6e2771b7f5feea2e11a1b7409bfd91b" exitCode=0 Dec 03 12:46:22 crc kubenswrapper[4666]: I1203 12:46:22.563955 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrrxv" event={"ID":"9aac4d80-7d0d-4037-a398-6a28ab35d1c9","Type":"ContainerDied","Data":"cfe8952f4198519cf95562a9c321f3f7f6e2771b7f5feea2e11a1b7409bfd91b"} Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.432965 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" path="/var/lib/kubelet/pods/0752436b-39dd-4cae-87a4-1b51901ad71f/volumes" Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.578233 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerStarted","Data":"a279afa3c2b9b60a570bc7d6129224f6bf07c1a9935bbf050498966299ae9815"} Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.578594 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.604149 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.520285948 podStartE2EDuration="15.604129106s" podCreationTimestamp="2025-12-03 12:46:08 +0000 UTC" firstStartedPulling="2025-12-03 12:46:09.709528287 +0000 UTC m=+1958.554489348" lastFinishedPulling="2025-12-03 12:46:22.793371455 +0000 UTC m=+1971.638332506" observedRunningTime="2025-12-03 12:46:23.598564426 +0000 UTC m=+1972.443525477" watchObservedRunningTime="2025-12-03 12:46:23.604129106 +0000 UTC m=+1972.449090157" Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.806891 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.831578 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d6d7b74fd-t9f48" Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.901254 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"] Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 
12:46:23.901530 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-845b8c48c4-mw2nj" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api-log" containerID="cri-o://9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f" gracePeriod=30 Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.901657 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-845b8c48c4-mw2nj" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api" containerID="cri-o://1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4" gracePeriod=30 Dec 03 12:46:23 crc kubenswrapper[4666]: I1203 12:46:23.949600 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.046917 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.046997 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4l2f\" (UniqueName: \"kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047076 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047152 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047205 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047251 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data\") pod \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\" (UID: \"9aac4d80-7d0d-4037-a398-6a28ab35d1c9\") " Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047330 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.047651 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.052006 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.054209 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f" (OuterVolumeSpecName: "kube-api-access-c4l2f") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "kube-api-access-c4l2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.054740 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts" (OuterVolumeSpecName: "scripts") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.075907 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.107148 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data" (OuterVolumeSpecName: "config-data") pod "9aac4d80-7d0d-4037-a398-6a28ab35d1c9" (UID: "9aac4d80-7d0d-4037-a398-6a28ab35d1c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.148834 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.148864 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4l2f\" (UniqueName: \"kubernetes.io/projected/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-kube-api-access-c4l2f\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.148879 4666 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.148887 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.148895 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aac4d80-7d0d-4037-a398-6a28ab35d1c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.586870 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vrrxv" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.586866 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vrrxv" event={"ID":"9aac4d80-7d0d-4037-a398-6a28ab35d1c9","Type":"ContainerDied","Data":"999ece3b77aa113ab5effe7978ad4f4730d08d33ed404e61b436ff478d1db290"} Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.588118 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="999ece3b77aa113ab5effe7978ad4f4730d08d33ed404e61b436ff478d1db290" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.589031 4666 generic.go:334] "Generic (PLEG): container finished" podID="02046b08-d2b5-444f-901f-6ec4f1195631" containerID="9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f" exitCode=143 Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.589104 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerDied","Data":"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"} Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.907926 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:24 crc kubenswrapper[4666]: E1203 12:46:24.908329 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" containerName="cinder-db-sync" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.908347 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" containerName="cinder-db-sync" Dec 03 12:46:24 crc kubenswrapper[4666]: E1203 12:46:24.908360 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="dnsmasq-dns" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.908366 4666 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="dnsmasq-dns" Dec 03 12:46:24 crc kubenswrapper[4666]: E1203 12:46:24.908378 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="init" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.908384 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="init" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.908555 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" containerName="cinder-db-sync" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.908572 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="0752436b-39dd-4cae-87a4-1b51901ad71f" containerName="dnsmasq-dns" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.935045 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.935162 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.940902 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mlszd" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.941136 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.941292 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.941566 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984550 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzgb\" (UniqueName: \"kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984598 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984630 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984659 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984728 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:24 crc kubenswrapper[4666]: I1203 12:46:24.984758 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.003565 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"] Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.007478 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.053631 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"] Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089614 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffzgb\" (UniqueName: \"kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089660 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089687 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089724 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089750 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089787 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089834 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dk8f\" (UniqueName: \"kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089869 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089903 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089925 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.089942 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.092240 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.093825 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.095187 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.095500 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.100122 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.104528 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.105952 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.116501 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.123878 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffzgb\" (UniqueName: \"kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb\") pod \"cinder-scheduler-0\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.139985 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.192923 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.192970 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.192993 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config\") pod 
\"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193032 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193060 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193082 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193309 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193336 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193381 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6gnp\" (UniqueName: \"kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193403 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193422 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.193454 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dk8f\" (UniqueName: \"kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 
12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.194354 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.194902 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.195059 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.195296 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.215677 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dk8f\" (UniqueName: \"kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f\") pod \"dnsmasq-dns-6d97fcdd8f-76cj6\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.262273 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295374 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295746 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295790 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295827 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295886 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6gnp\" (UniqueName: \"kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.295925 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.296014 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.296151 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.298280 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.299514 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0" 
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.303760 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.303785 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.310519 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.316295 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6gnp\" (UniqueName: \"kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp\") pod \"cinder-api-0\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") " pod="openstack/cinder-api-0"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.349357 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.430984 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69"
Dec 03 12:46:25 crc kubenswrapper[4666]: E1203 12:46:25.431388 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.495514 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.767617 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 12:46:25 crc kubenswrapper[4666]: W1203 12:46:25.886487 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdbf81e4_0170_4cd0_bc4b_6a1194bc6c6b.slice/crio-4ab63e07113c48a9a09e2b5a263ae0c8bf04e861b10eaa8df1a0c0b137a3afef WatchSource:0}: Error finding container 4ab63e07113c48a9a09e2b5a263ae0c8bf04e861b10eaa8df1a0c0b137a3afef: Status 404 returned error can't find the container with id 4ab63e07113c48a9a09e2b5a263ae0c8bf04e861b10eaa8df1a0c0b137a3afef
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.886774 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"]
Dec 03 12:46:25 crc kubenswrapper[4666]: I1203 12:46:25.979391 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9676dbdb4-pcj6f"
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.096231 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.350245 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9676dbdb4-pcj6f"
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.630700 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerStarted","Data":"42ed243ea62cf491ae690840e0200f272c0b585f2ca9402401429b4b50d0517f"}
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.632289 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerStarted","Data":"ea9da3934edad826b06a87821f4ad41f6eb67b350f8e43a4a6d5f2f9d8f50777"}
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.635220 4666 generic.go:334] "Generic (PLEG): container finished" podID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerID="36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc" exitCode=0
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.636669 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" event={"ID":"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b","Type":"ContainerDied","Data":"36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc"}
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.636702 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" event={"ID":"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b","Type":"ContainerStarted","Data":"4ab63e07113c48a9a09e2b5a263ae0c8bf04e861b10eaa8df1a0c0b137a3afef"}
Dec 03 12:46:26 crc kubenswrapper[4666]: I1203 12:46:26.816367 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5fdb97596b-722zc"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.036479 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-845b8c48c4-mw2nj" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:54428->10.217.0.144:9311: read: connection reset by peer"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.036499 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-845b8c48c4-mw2nj" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:54442->10.217.0.144:9311: read: connection reset by peer"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.452987 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-845b8c48c4-mw2nj"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.548648 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom\") pod \"02046b08-d2b5-444f-901f-6ec4f1195631\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") "
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.548681 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle\") pod \"02046b08-d2b5-444f-901f-6ec4f1195631\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") "
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.548749 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9nmm\" (UniqueName: \"kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm\") pod \"02046b08-d2b5-444f-901f-6ec4f1195631\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") "
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.548794 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs\") pod \"02046b08-d2b5-444f-901f-6ec4f1195631\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") "
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.548925 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data\") pod \"02046b08-d2b5-444f-901f-6ec4f1195631\" (UID: \"02046b08-d2b5-444f-901f-6ec4f1195631\") "
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.551403 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs" (OuterVolumeSpecName: "logs") pod "02046b08-d2b5-444f-901f-6ec4f1195631" (UID: "02046b08-d2b5-444f-901f-6ec4f1195631"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.554849 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm" (OuterVolumeSpecName: "kube-api-access-d9nmm") pod "02046b08-d2b5-444f-901f-6ec4f1195631" (UID: "02046b08-d2b5-444f-901f-6ec4f1195631"). InnerVolumeSpecName "kube-api-access-d9nmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.564994 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02046b08-d2b5-444f-901f-6ec4f1195631" (UID: "02046b08-d2b5-444f-901f-6ec4f1195631"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.597243 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02046b08-d2b5-444f-901f-6ec4f1195631" (UID: "02046b08-d2b5-444f-901f-6ec4f1195631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.630964 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data" (OuterVolumeSpecName: "config-data") pod "02046b08-d2b5-444f-901f-6ec4f1195631" (UID: "02046b08-d2b5-444f-901f-6ec4f1195631"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.650973 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.651032 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.651047 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9nmm\" (UniqueName: \"kubernetes.io/projected/02046b08-d2b5-444f-901f-6ec4f1195631-kube-api-access-d9nmm\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.651060 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02046b08-d2b5-444f-901f-6ec4f1195631-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.651134 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02046b08-d2b5-444f-901f-6ec4f1195631-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.661909 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerStarted","Data":"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"}
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.664287 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerStarted","Data":"4ec8c3c0bd75bd8eb337878dc76b255d385df4e0d6b42d68224bfd786b307590"}
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.666668 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" event={"ID":"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b","Type":"ContainerStarted","Data":"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae"}
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.667873 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.672511 4666 generic.go:334] "Generic (PLEG): container finished" podID="02046b08-d2b5-444f-901f-6ec4f1195631" containerID="1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4" exitCode=0
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.672600 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-845b8c48c4-mw2nj"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.672672 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerDied","Data":"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"}
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.672737 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-845b8c48c4-mw2nj" event={"ID":"02046b08-d2b5-444f-901f-6ec4f1195631","Type":"ContainerDied","Data":"b62ebdb11ab717354a41edeff089dcf2f34e125a43357293e381a2ada5b9860c"}
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.672810 4666 scope.go:117] "RemoveContainer" containerID="1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.686142 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" podStartSLOduration=3.686128182 podStartE2EDuration="3.686128182s" podCreationTimestamp="2025-12-03 12:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:27.685346091 +0000 UTC m=+1976.530307142" watchObservedRunningTime="2025-12-03 12:46:27.686128182 +0000 UTC m=+1976.531089233"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.706765 4666 scope.go:117] "RemoveContainer" containerID="9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.717196 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"]
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.726827 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-845b8c48c4-mw2nj"]
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.733955 4666 scope.go:117] "RemoveContainer" containerID="1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"
Dec 03 12:46:27 crc kubenswrapper[4666]: E1203 12:46:27.737303 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4\": container with ID starting with 1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4 not found: ID does not exist" containerID="1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.737361 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4"} err="failed to get container status \"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4\": rpc error: code = NotFound desc = could not find container \"1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4\": container with ID starting with 1bae44b906c40e9b44918ca41da8467aa28b6fc149fb3af6b967d2dd6c55ceb4 not found: ID does not exist"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.737392 4666 scope.go:117] "RemoveContainer" containerID="9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"
Dec 03 12:46:27 crc kubenswrapper[4666]: E1203 12:46:27.737788 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f\": container with ID starting with 9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f not found: ID does not exist" containerID="9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"
Dec 03 12:46:27 crc kubenswrapper[4666]: I1203 12:46:27.737813 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f"} err="failed to get container status \"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f\": rpc error: code = NotFound desc = could not find container \"9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f\": container with ID starting with 9f984edc6185da2edddf95d4d9b32726f7cc1dd11f46f9a864205ac559cd4c7f not found: ID does not exist"
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.101025 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.438080 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.683112 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerStarted","Data":"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"}
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.683408 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.685318 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerStarted","Data":"fcf602955588575ad2c78008cd23aa0089e9e10a28fde323f9c07a864a9643e2"}
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.726837 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.615560784 podStartE2EDuration="4.726820103s" podCreationTimestamp="2025-12-03 12:46:24 +0000 UTC" firstStartedPulling="2025-12-03 12:46:25.752426367 +0000 UTC m=+1974.597387428" lastFinishedPulling="2025-12-03 12:46:26.863685696 +0000 UTC m=+1975.708646747" observedRunningTime="2025-12-03 12:46:28.721405757 +0000 UTC m=+1977.566366808" watchObservedRunningTime="2025-12-03 12:46:28.726820103 +0000 UTC m=+1977.571781154"
Dec 03 12:46:28 crc kubenswrapper[4666]: I1203 12:46:28.726951 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.7269368370000002 podStartE2EDuration="3.726936837s" podCreationTimestamp="2025-12-03 12:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:28.701484619 +0000 UTC m=+1977.546445660" watchObservedRunningTime="2025-12-03 12:46:28.726936837 +0000 UTC m=+1977.571897898"
Dec 03 12:46:29 crc kubenswrapper[4666]: I1203 12:46:29.450676 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" path="/var/lib/kubelet/pods/02046b08-d2b5-444f-901f-6ec4f1195631/volumes"
Dec 03 12:46:29 crc kubenswrapper[4666]: I1203 12:46:29.694255 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api-log" containerID="cri-o://a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2" gracePeriod=30
Dec 03 12:46:29 crc kubenswrapper[4666]: I1203 12:46:29.694314 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api" containerID="cri-o://7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d" gracePeriod=30
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.211830 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-96d8bfbbf-pd9x2"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.262401 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.299523 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"]
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.299972 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f668b5844-p5d4j" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-api" containerID="cri-o://73b6f33d0e7a54b88144cef186c00a72458817ce9202a1dc023b25d7fd4d7b7c" gracePeriod=30
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.300358 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f668b5844-p5d4j" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-httpd" containerID="cri-o://520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200" gracePeriod=30
Dec 03 12:46:30 crc kubenswrapper[4666]: E1203 12:46:30.645767 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f658bc5_7c9c_4534_8554_cb34af6b5a8b.slice/crio-conmon-520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f658bc5_7c9c_4534_8554_cb34af6b5a8b.slice/crio-520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.683227 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.719808 4666 generic.go:334] "Generic (PLEG): container finished" podID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerID="520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200" exitCode=0
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.720043 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerDied","Data":"520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200"}
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.726868 4666 generic.go:334] "Generic (PLEG): container finished" podID="59da55ae-1495-452b-93c3-11c21f464d3c" containerID="7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d" exitCode=0
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.726896 4666 generic.go:334] "Generic (PLEG): container finished" podID="59da55ae-1495-452b-93c3-11c21f464d3c" containerID="a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2" exitCode=143
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.727169 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.727211 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerDied","Data":"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"}
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.727288 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerDied","Data":"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"}
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.727302 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"59da55ae-1495-452b-93c3-11c21f464d3c","Type":"ContainerDied","Data":"42ed243ea62cf491ae690840e0200f272c0b585f2ca9402401429b4b50d0517f"}
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.727346 4666 scope.go:117] "RemoveContainer" containerID="7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.745930 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.746324 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.746402 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.746439 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.746159 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.746946 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs" (OuterVolumeSpecName: "logs") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.747422 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.747637 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6gnp\" (UniqueName: \"kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.747735 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom\") pod \"59da55ae-1495-452b-93c3-11c21f464d3c\" (UID: \"59da55ae-1495-452b-93c3-11c21f464d3c\") "
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.748475 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59da55ae-1495-452b-93c3-11c21f464d3c-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.748577 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59da55ae-1495-452b-93c3-11c21f464d3c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.752665 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts" (OuterVolumeSpecName: "scripts") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.753599 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.753665 4666 scope.go:117] "RemoveContainer" containerID="a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.755877 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp" (OuterVolumeSpecName: "kube-api-access-h6gnp") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "kube-api-access-h6gnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.774708 4666 scope.go:117] "RemoveContainer" containerID="7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"
Dec 03 12:46:30 crc kubenswrapper[4666]: E1203 12:46:30.775250 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d\": container with ID starting with 7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d not found: ID does not exist" containerID="7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.775287 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"} err="failed to get container status \"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d\": rpc error: code = NotFound desc = could not find container \"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d\": container with ID starting with 7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d not found: ID does not exist"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.775307 4666 scope.go:117] "RemoveContainer" containerID="a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"
Dec 03 12:46:30 crc kubenswrapper[4666]: E1203 12:46:30.776809 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2\": container with ID starting with a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2 not found: ID does not exist" containerID="a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.776850 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"} err="failed to get container status \"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2\": rpc error: code = NotFound desc = could not find container \"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2\": container with ID starting with a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2 not found: ID does not exist"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.776882 4666 scope.go:117] "RemoveContainer" containerID="7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.777204 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d"} err="failed to get container status \"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d\": rpc error: code = NotFound desc = could not find container \"7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d\": container with ID starting with 7efa1a23c4f3ae8435b0f1e4e6dc748fe2857a1058e9392fc4dadddd154c124d not found: ID does not exist"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.777243 4666 scope.go:117] "RemoveContainer" containerID="a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.777519 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2"} err="failed to get container status \"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2\": rpc error: code = NotFound desc = could not find container \"a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2\": container with ID starting with a9919d10dd33f224fcabace4fd3c9cb9787df4e3bb3a13039ff5629e32557fe2 not found: ID does not exist"
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.784220 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.801248 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data" (OuterVolumeSpecName: "config-data") pod "59da55ae-1495-452b-93c3-11c21f464d3c" (UID: "59da55ae-1495-452b-93c3-11c21f464d3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.851127 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6gnp\" (UniqueName: \"kubernetes.io/projected/59da55ae-1495-452b-93c3-11c21f464d3c-kube-api-access-h6gnp\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.851164 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.851175 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.851186 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:30 crc kubenswrapper[4666]: I1203 12:46:30.851198 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59da55ae-1495-452b-93c3-11c21f464d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.065541 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.074291 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.091235 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.091684 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.091707 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.091739 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.091747 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.091759 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.091766 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.091779 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.091787 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.092032 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.092069 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02046b08-d2b5-444f-901f-6ec4f1195631" containerName="barbican-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.092080 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.092109 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" containerName="cinder-api-log"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.093336 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.096510 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.096715 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.099963 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.106051 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.155905 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.155948 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9jrs\" (UniqueName: \"kubernetes.io/projected/58b62594-91e8-4cc7-8076-094fba5bcc66-kube-api-access-g9jrs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.155979 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156015 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58b62594-91e8-4cc7-8076-094fba5bcc66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156034 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156057 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b62594-91e8-4cc7-8076-094fba5bcc66-logs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156177 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data-custom\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156287 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-scripts\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.156328 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.257890 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-scripts\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.257942 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258059 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9jrs\" (UniqueName: \"kubernetes.io/projected/58b62594-91e8-4cc7-8076-094fba5bcc66-kube-api-access-g9jrs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258102 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258142 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58b62594-91e8-4cc7-8076-094fba5bcc66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258166 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258199 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b62594-91e8-4cc7-8076-094fba5bcc66-logs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258231 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data-custom\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.258515 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58b62594-91e8-4cc7-8076-094fba5bcc66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.259105 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b62594-91e8-4cc7-8076-094fba5bcc66-logs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.262113 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-scripts\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.262201 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data-custom\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.262779 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.263033 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-public-tls-certs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.263796 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.272955 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b62594-91e8-4cc7-8076-094fba5bcc66-config-data\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.276300 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9jrs\" (UniqueName: \"kubernetes.io/projected/58b62594-91e8-4cc7-8076-094fba5bcc66-kube-api-access-g9jrs\") pod \"cinder-api-0\" (UID: \"58b62594-91e8-4cc7-8076-094fba5bcc66\") " pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.358808 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.360058 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.362407 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.363105 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ds9kk"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.363271 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.380829 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.409496 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.463682 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.464124 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.464202 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config-secret\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.464285 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmr9p\" (UniqueName: \"kubernetes.io/projected/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-kube-api-access-rmr9p\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.472784 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59da55ae-1495-452b-93c3-11c21f464d3c" path="/var/lib/kubelet/pods/59da55ae-1495-452b-93c3-11c21f464d3c/volumes"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.566466 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.566550 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.566600 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config-secret\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: W1203 12:46:31.570216 4666 reflector.go:561] object-"openstack"/"openstack-config-secret": failed to list *v1.Secret: secrets "openstack-config-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.570263 4666 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-config-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-config-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.570295 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.571315 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmr9p\" (UniqueName: \"kubernetes.io/projected/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-kube-api-access-rmr9p\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.572070 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.572878 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-rmr9p openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="3db2a185-5bc3-436b-adc9-e696b4ed3bc7"
Dec 03 12:46:31 crc kubenswrapper[4666]: W1203 12:46:31.574896 4666 reflector.go:561] object-"openstack"/"openstack-config": failed to list *v1.ConfigMap: configmaps "openstack-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.574929 4666 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.577055 4666 projected.go:194] Error preparing data for projected volume kube-api-access-rmr9p for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 03 12:46:31 crc kubenswrapper[4666]: E1203 12:46:31.577141 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-kube-api-access-rmr9p podName:3db2a185-5bc3-436b-adc9-e696b4ed3bc7 nodeName:}" failed. No retries permitted until 2025-12-03 12:46:32.077118727 +0000 UTC m=+1980.922079778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rmr9p" (UniqueName: "kubernetes.io/projected/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-kube-api-access-rmr9p") pod "openstackclient" (UID: "3db2a185-5bc3-436b-adc9-e696b4ed3bc7") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.579941 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.635899 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.637170 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.648951 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.736025 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.739611 4666 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3db2a185-5bc3-436b-adc9-e696b4ed3bc7" podUID="a7c6b242-ba03-4e43-9061-e908c5af1c78"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.745609 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.774321 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgs9s\" (UniqueName: \"kubernetes.io/projected/a7c6b242-ba03-4e43-9061-e908c5af1c78-kube-api-access-tgs9s\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.774405 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.774454 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.774487 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.875316 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle\") pod \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\" (UID: \"3db2a185-5bc3-436b-adc9-e696b4ed3bc7\") "
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.875820 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.875861 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.875891 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.875992 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgs9s\" (UniqueName: \"kubernetes.io/projected/a7c6b242-ba03-4e43-9061-e908c5af1c78-kube-api-access-tgs9s\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.876110 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmr9p\" (UniqueName: \"kubernetes.io/projected/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-kube-api-access-rmr9p\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.881437 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.881526 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db2a185-5bc3-436b-adc9-e696b4ed3bc7" (UID: "3db2a185-5bc3-436b-adc9-e696b4ed3bc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.901618 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgs9s\" (UniqueName: \"kubernetes.io/projected/a7c6b242-ba03-4e43-9061-e908c5af1c78-kube-api-access-tgs9s\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient"
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.948540 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 12:46:31 crc kubenswrapper[4666]: I1203 12:46:31.978272 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.568515 4666 secret.go:188] Couldn't get secret openstack/openstack-config-secret: failed to sync secret cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.568975 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config-secret podName:3db2a185-5bc3-436b-adc9-e696b4ed3bc7 nodeName:}" failed. No retries permitted until 2025-12-03 12:46:33.06895124 +0000 UTC m=+1981.913912291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config-secret") pod "openstackclient" (UID: "3db2a185-5bc3-436b-adc9-e696b4ed3bc7") : failed to sync secret cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.572975 4666 configmap.go:193] Couldn't get configMap openstack/openstack-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.573046 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config podName:3db2a185-5bc3-436b-adc9-e696b4ed3bc7 nodeName:}" failed. No retries permitted until 2025-12-03 12:46:33.07301904 +0000 UTC m=+1981.917980091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config") pod "openstackclient" (UID: "3db2a185-5bc3-436b-adc9-e696b4ed3bc7") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.595503 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.595537 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3db2a185-5bc3-436b-adc9-e696b4ed3bc7-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.747228 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.753269 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58b62594-91e8-4cc7-8076-094fba5bcc66","Type":"ContainerStarted","Data":"5801db87b0803d9e20b0ba367fc7df989bb90a6eaa23aac46119dbade73b446b"}
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.753310 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58b62594-91e8-4cc7-8076-094fba5bcc66","Type":"ContainerStarted","Data":"1cccf76778d6aff9343110e1f7f83cafb74cee23f1ff2748b002af8b59f5866f"}
Dec 03 12:46:32 crc kubenswrapper[4666]: I1203 12:46:32.756392 4666 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3db2a185-5bc3-436b-adc9-e696b4ed3bc7" podUID="a7c6b242-ba03-4e43-9061-e908c5af1c78"
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.877433 4666 configmap.go:193] Couldn't get configMap openstack/openstack-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.877518 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config podName:a7c6b242-ba03-4e43-9061-e908c5af1c78 nodeName:}" failed. No retries permitted until 2025-12-03 12:46:33.377499674 +0000 UTC m=+1982.222460726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config") pod "openstackclient" (UID: "a7c6b242-ba03-4e43-9061-e908c5af1c78") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.877762 4666 secret.go:188] Couldn't get secret openstack/openstack-config-secret: failed to sync secret cache: timed out waiting for the condition
Dec 03 12:46:32 crc kubenswrapper[4666]: E1203 12:46:32.877819 4666 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret podName:a7c6b242-ba03-4e43-9061-e908c5af1c78 nodeName:}" failed. No retries permitted until 2025-12-03 12:46:33.377808063 +0000 UTC m=+1982.222769114 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret") pod "openstackclient" (UID: "a7c6b242-ba03-4e43-9061-e908c5af1c78") : failed to sync secret cache: timed out waiting for the condition Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.408906 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.409413 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.436294 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db2a185-5bc3-436b-adc9-e696b4ed3bc7" path="/var/lib/kubelet/pods/3db2a185-5bc3-436b-adc9-e696b4ed3bc7/volumes" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.457052 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.461185 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.758217 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58b62594-91e8-4cc7-8076-094fba5bcc66","Type":"ContainerStarted","Data":"f76ef71e7c22643201b00d57d17e2e7c9c47f41efe8e2c70af0e31973e79fc3f"} Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.759375 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 12:46:33 crc kubenswrapper[4666]: I1203 12:46:33.779571 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.779551882 podStartE2EDuration="2.779551882s" podCreationTimestamp="2025-12-03 12:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:33.774018542 +0000 UTC m=+1982.618979593" watchObservedRunningTime="2025-12-03 12:46:33.779551882 +0000 UTC m=+1982.624512933" Dec 03 12:46:34 crc kubenswrapper[4666]: I1203 12:46:34.278804 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 12:46:34 crc kubenswrapper[4666]: I1203 12:46:34.287751 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c6b242-ba03-4e43-9061-e908c5af1c78-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c6b242-ba03-4e43-9061-e908c5af1c78\") " pod="openstack/openstackclient" Dec 03 12:46:34 crc kubenswrapper[4666]: I1203 12:46:34.368142 4666 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"openstackclient-openstackclient-dockercfg-ds9kk" Dec 03 12:46:34 crc kubenswrapper[4666]: I1203 12:46:34.376886 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 12:46:34 crc kubenswrapper[4666]: I1203 12:46:34.825403 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.351153 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.445382 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.445625 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="dnsmasq-dns" containerID="cri-o://4b67c7f22d7011471b83ac6b8adee6d1f1498093b2b80b638cdb7921d1495935" gracePeriod=10 Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.530918 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.601568 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.784346 4666 generic.go:334] "Generic (PLEG): container finished" podID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerID="73b6f33d0e7a54b88144cef186c00a72458817ce9202a1dc023b25d7fd4d7b7c" exitCode=0 Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.784426 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerDied","Data":"73b6f33d0e7a54b88144cef186c00a72458817ce9202a1dc023b25d7fd4d7b7c"} Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.788497 4666 generic.go:334] "Generic (PLEG): container finished" podID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerID="4b67c7f22d7011471b83ac6b8adee6d1f1498093b2b80b638cdb7921d1495935" exitCode=0 Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.788584 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerDied","Data":"4b67c7f22d7011471b83ac6b8adee6d1f1498093b2b80b638cdb7921d1495935"} Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.791231 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a7c6b242-ba03-4e43-9061-e908c5af1c78","Type":"ContainerStarted","Data":"d7a36ba008e0c44af51045de9ad82532642befdd443327b0d6363ebb5a8f03e7"} Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.791266 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="cinder-scheduler" containerID="cri-o://4ec8c3c0bd75bd8eb337878dc76b255d385df4e0d6b42d68224bfd786b307590" gracePeriod=30 Dec 03 12:46:35 crc kubenswrapper[4666]: I1203 12:46:35.791363 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="probe" containerID="cri-o://fcf602955588575ad2c78008cd23aa0089e9e10a28fde323f9c07a864a9643e2" gracePeriod=30 
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.075466 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp"
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.126475 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f668b5844-p5d4j"
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.173694 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb\") pod \"78ed53b7-3b15-46aa-a024-8f1b69d62469\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.173765 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqr5k\" (UniqueName: \"kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k\") pod \"78ed53b7-3b15-46aa-a024-8f1b69d62469\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.173788 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb\") pod \"78ed53b7-3b15-46aa-a024-8f1b69d62469\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.173810 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc\") pod \"78ed53b7-3b15-46aa-a024-8f1b69d62469\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.173885 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config\") pod \"78ed53b7-3b15-46aa-a024-8f1b69d62469\" (UID: \"78ed53b7-3b15-46aa-a024-8f1b69d62469\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.181853 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k" (OuterVolumeSpecName: "kube-api-access-bqr5k") pod "78ed53b7-3b15-46aa-a024-8f1b69d62469" (UID: "78ed53b7-3b15-46aa-a024-8f1b69d62469"). InnerVolumeSpecName "kube-api-access-bqr5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.221265 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78ed53b7-3b15-46aa-a024-8f1b69d62469" (UID: "78ed53b7-3b15-46aa-a024-8f1b69d62469"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.223654 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config" (OuterVolumeSpecName: "config") pod "78ed53b7-3b15-46aa-a024-8f1b69d62469" (UID: "78ed53b7-3b15-46aa-a024-8f1b69d62469"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.223998 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78ed53b7-3b15-46aa-a024-8f1b69d62469" (UID: "78ed53b7-3b15-46aa-a024-8f1b69d62469"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.226409 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78ed53b7-3b15-46aa-a024-8f1b69d62469" (UID: "78ed53b7-3b15-46aa-a024-8f1b69d62469"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.275196 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config\") pod \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.275610 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs\") pod \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276061 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config\") pod \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276146 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle\") pod \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276301 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wt9\" (UniqueName: \"kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9\") pod \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\" (UID: \"0f658bc5-7c9c-4534-8554-cb34af6b5a8b\") "
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276758 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276823 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqr5k\" (UniqueName: \"kubernetes.io/projected/78ed53b7-3b15-46aa-a024-8f1b69d62469-kube-api-access-bqr5k\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276837 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276850 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.276860 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ed53b7-3b15-46aa-a024-8f1b69d62469-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.278794 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0f658bc5-7c9c-4534-8554-cb34af6b5a8b" (UID: "0f658bc5-7c9c-4534-8554-cb34af6b5a8b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.279950 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9" (OuterVolumeSpecName: "kube-api-access-g4wt9") pod "0f658bc5-7c9c-4534-8554-cb34af6b5a8b" (UID: "0f658bc5-7c9c-4534-8554-cb34af6b5a8b"). InnerVolumeSpecName "kube-api-access-g4wt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.329313 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config" (OuterVolumeSpecName: "config") pod "0f658bc5-7c9c-4534-8554-cb34af6b5a8b" (UID: "0f658bc5-7c9c-4534-8554-cb34af6b5a8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.337893 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f658bc5-7c9c-4534-8554-cb34af6b5a8b" (UID: "0f658bc5-7c9c-4534-8554-cb34af6b5a8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.355748 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0f658bc5-7c9c-4534-8554-cb34af6b5a8b" (UID: "0f658bc5-7c9c-4534-8554-cb34af6b5a8b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.377956 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.377987 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.377999 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wt9\" (UniqueName: \"kubernetes.io/projected/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-kube-api-access-g4wt9\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.378009 4666 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.378018 4666 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f658bc5-7c9c-4534-8554-cb34af6b5a8b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.803032 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp" event={"ID":"78ed53b7-3b15-46aa-a024-8f1b69d62469","Type":"ContainerDied","Data":"a660b7757ea5002394dbc94ed0286456ef847cb2c941aa93a71d31a5193ad98b"}
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.803053 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-vw4kp"
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.803106 4666 scope.go:117] "RemoveContainer" containerID="4b67c7f22d7011471b83ac6b8adee6d1f1498093b2b80b638cdb7921d1495935"
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.805905 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f668b5844-p5d4j" event={"ID":"0f658bc5-7c9c-4534-8554-cb34af6b5a8b","Type":"ContainerDied","Data":"819a0a5c0280ce68ee43e3704a738d8ef0a3e1750a6a7c2f1a47d90718da292b"}
Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.805951 4666 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-5f668b5844-p5d4j" Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.843376 4666 scope.go:117] "RemoveContainer" containerID="c35483ead349fa2e891c1e6e276cae2be8079e3e1d56d8bc57d1a5fa5e346e17" Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.849544 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.864047 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-vw4kp"] Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.876585 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"] Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.879988 4666 scope.go:117] "RemoveContainer" containerID="520a3b7cf2298fc23b64bd8ad01b04fc869cc401e5a7c32b14bc42746e3fe200" Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.883739 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f668b5844-p5d4j"] Dec 03 12:46:36 crc kubenswrapper[4666]: I1203 12:46:36.916755 4666 scope.go:117] "RemoveContainer" containerID="73b6f33d0e7a54b88144cef186c00a72458817ce9202a1dc023b25d7fd4d7b7c" Dec 03 12:46:37 crc kubenswrapper[4666]: I1203 12:46:37.423826 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:46:37 crc kubenswrapper[4666]: E1203 12:46:37.424060 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:46:37 crc kubenswrapper[4666]: I1203 12:46:37.432731 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" path="/var/lib/kubelet/pods/0f658bc5-7c9c-4534-8554-cb34af6b5a8b/volumes" Dec 03 12:46:37 crc kubenswrapper[4666]: I1203 12:46:37.433308 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" path="/var/lib/kubelet/pods/78ed53b7-3b15-46aa-a024-8f1b69d62469/volumes" Dec 03 12:46:38 crc kubenswrapper[4666]: I1203 12:46:38.831139 4666 generic.go:334] "Generic (PLEG): container finished" podID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerID="fcf602955588575ad2c78008cd23aa0089e9e10a28fde323f9c07a864a9643e2" exitCode=0 Dec 03 12:46:38 crc kubenswrapper[4666]: I1203 12:46:38.831142 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerDied","Data":"fcf602955588575ad2c78008cd23aa0089e9e10a28fde323f9c07a864a9643e2"} Dec 03 12:46:39 crc kubenswrapper[4666]: I1203 12:46:39.156744 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 12:46:39 crc kubenswrapper[4666]: I1203 12:46:39.852722 4666 generic.go:334] "Generic (PLEG): container finished" podID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerID="4ec8c3c0bd75bd8eb337878dc76b255d385df4e0d6b42d68224bfd786b307590" exitCode=0 Dec 03 12:46:39 crc kubenswrapper[4666]: I1203 12:46:39.852769 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerDied","Data":"4ec8c3c0bd75bd8eb337878dc76b255d385df4e0d6b42d68224bfd786b307590"} Dec 03 12:46:41 crc kubenswrapper[4666]: I1203 12:46:41.016303 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:41 crc kubenswrapper[4666]: I1203 12:46:41.016911 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerName="kube-state-metrics" containerID="cri-o://f6bda6c9e80596ed08358252b619f5ef36bb71435df79dd92adbe4e4abbb4789" gracePeriod=30 Dec 03 12:46:41 crc kubenswrapper[4666]: I1203 12:46:41.872544 4666 generic.go:334] "Generic (PLEG): container finished" podID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerID="f6bda6c9e80596ed08358252b619f5ef36bb71435df79dd92adbe4e4abbb4789" exitCode=2 Dec 03 12:46:41 crc kubenswrapper[4666]: I1203 12:46:41.872741 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74","Type":"ContainerDied","Data":"f6bda6c9e80596ed08358252b619f5ef36bb71435df79dd92adbe4e4abbb4789"} Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.176692 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.177071 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-central-agent" containerID="cri-o://51390f023f6a5e2de380856a2fa0f240602ed0a97d64502e92b41bf31a13357d" gracePeriod=30 Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.177142 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="proxy-httpd" containerID="cri-o://a279afa3c2b9b60a570bc7d6129224f6bf07c1a9935bbf050498966299ae9815" gracePeriod=30 Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.177183 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-notification-agent" containerID="cri-o://ca8ebbbf5482d7074e56ea0dcbb4d06bc5bd74e098ee400d3c2582bb272b8bee" gracePeriod=30 Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.177223 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="sg-core" containerID="cri-o://ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80" gracePeriod=30 Dec 03 12:46:42 crc kubenswrapper[4666]: I1203 12:46:42.211576 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": dial tcp 10.217.0.100:8081: connect: connection refused" Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.923566 4666 generic.go:334] "Generic (PLEG): container finished" podID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerID="a279afa3c2b9b60a570bc7d6129224f6bf07c1a9935bbf050498966299ae9815" exitCode=0 Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.923992 4666 generic.go:334] "Generic (PLEG): container finished" 
podID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerID="ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80" exitCode=2 Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.924005 4666 generic.go:334] "Generic (PLEG): container finished" podID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerID="ca8ebbbf5482d7074e56ea0dcbb4d06bc5bd74e098ee400d3c2582bb272b8bee" exitCode=0 Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.924016 4666 generic.go:334] "Generic (PLEG): container finished" podID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerID="51390f023f6a5e2de380856a2fa0f240602ed0a97d64502e92b41bf31a13357d" exitCode=0 Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.923626 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerDied","Data":"a279afa3c2b9b60a570bc7d6129224f6bf07c1a9935bbf050498966299ae9815"} Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.924060 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerDied","Data":"ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80"} Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.924079 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerDied","Data":"ca8ebbbf5482d7074e56ea0dcbb4d06bc5bd74e098ee400d3c2582bb272b8bee"} Dec 03 12:46:44 crc kubenswrapper[4666]: I1203 12:46:44.924113 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerDied","Data":"51390f023f6a5e2de380856a2fa0f240602ed0a97d64502e92b41bf31a13357d"} Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.045367 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.819190 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.825991 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.841075 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.934249 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b052c5d6-2493-4e71-9df6-3b275aafaf2e","Type":"ContainerDied","Data":"ea9da3934edad826b06a87821f4ad41f6eb67b350f8e43a4a6d5f2f9d8f50777"} Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.934279 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.934297 4666 scope.go:117] "RemoveContainer" containerID="fcf602955588575ad2c78008cd23aa0089e9e10a28fde323f9c07a864a9643e2" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.936732 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74","Type":"ContainerDied","Data":"9c9d12921ce0e191ed32472380d32cca4f6bb0ba1fdafbc6c4c5b433d6eecf46"} Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.936725 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.941375 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac","Type":"ContainerDied","Data":"8fa1fe75d6deab6ab597248e5602663c568f493bed0c13c2dc4c158b2c86ac3c"} Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.941441 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.993623 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.993798 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.993947 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.993999 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.994125 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.994197 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffzgb\" (UniqueName: \"kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.994933 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6c8k\" (UniqueName: 
\"kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.994985 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.995275 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.995349 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.995598 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:45 crc kubenswrapper[4666]: I1203 12:46:45.996294 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id\") pod \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\" (UID: \"b052c5d6-2493-4e71-9df6-3b275aafaf2e\") " Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:45.996799 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:45.996978 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:45.997045 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.003446 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k" (OuterVolumeSpecName: "kube-api-access-r6c8k") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "kube-api-access-r6c8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.003583 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts" (OuterVolumeSpecName: "scripts") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.003258 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pht\" (UniqueName: \"kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht\") pod \"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74\" (UID: \"ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74\") " Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.003677 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts\") pod \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\" (UID: \"8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac\") " Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.004350 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.005804 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb" (OuterVolumeSpecName: "kube-api-access-ffzgb") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "kube-api-access-ffzgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008697 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008737 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b052c5d6-2493-4e71-9df6-3b275aafaf2e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008751 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008762 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffzgb\" (UniqueName: \"kubernetes.io/projected/b052c5d6-2493-4e71-9df6-3b275aafaf2e-kube-api-access-ffzgb\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008775 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6c8k\" (UniqueName: \"kubernetes.io/projected/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-kube-api-access-r6c8k\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008787 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.008798 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.010959 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht" (OuterVolumeSpecName: "kube-api-access-67pht") pod "ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" (UID: "ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74"). InnerVolumeSpecName "kube-api-access-67pht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.018390 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts" (OuterVolumeSpecName: "scripts") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.058800 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.077643 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.080351 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.095889 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data" (OuterVolumeSpecName: "config-data") pod "b052c5d6-2493-4e71-9df6-3b275aafaf2e" (UID: "b052c5d6-2493-4e71-9df6-3b275aafaf2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.105596 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data" (OuterVolumeSpecName: "config-data") pod "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" (UID: "8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.110784 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.110892 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.110961 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pht\" (UniqueName: \"kubernetes.io/projected/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74-kube-api-access-67pht\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.111028 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.111102 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.111173 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.111241 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b052c5d6-2493-4e71-9df6-3b275aafaf2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.274701 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.286377 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.301000 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.309750 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329250 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329585 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329602 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329616 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="probe" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329622 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="probe" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329633 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="proxy-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329642 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="proxy-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329653 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="cinder-scheduler" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329658 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="cinder-scheduler" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329668 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerName="kube-state-metrics" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329673 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerName="kube-state-metrics" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329683 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-central-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329689 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-central-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329698 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="sg-core" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329704 4666 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="sg-core" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329715 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-notification-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329720 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-notification-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329731 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="dnsmasq-dns" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329737 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="dnsmasq-dns" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329749 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-api" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329755 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-api" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.329772 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="init" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329778 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="init" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329919 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ed53b7-3b15-46aa-a024-8f1b69d62469" containerName="dnsmasq-dns" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329932 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" containerName="kube-state-metrics" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329938 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="cinder-scheduler" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329945 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="proxy-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329952 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-httpd" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329964 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="sg-core" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329973 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" containerName="probe" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329982 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-central-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.329993 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" containerName="ceilometer-notification-agent" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 
12:46:46.329999 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f658bc5-7c9c-4534-8554-cb34af6b5a8b" containerName="neutron-api" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.330801 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.330826 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.330894 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.332799 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.342425 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.343925 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.349736 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.349957 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8zqjm" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.349962 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.359693 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.372958 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.374825 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.377110 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.377398 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.377895 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.409230 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414184 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcwkx\" (UniqueName: \"kubernetes.io/projected/b7e78364-3d2e-435a-a0fb-d85cb2586006-kube-api-access-bcwkx\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414226 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414264 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414297 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414320 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414338 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414362 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414381 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414413 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414431 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414451 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvsfl\" (UniqueName: \"kubernetes.io/projected/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-api-access-vvsfl\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414485 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414500 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7e78364-3d2e-435a-a0fb-d85cb2586006-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414539 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414556 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414577 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414592 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khzgf\" 
(UniqueName: \"kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.414609 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.423122 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516256 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516330 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516374 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvsfl\" (UniqueName: \"kubernetes.io/projected/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-api-access-vvsfl\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516440 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516473 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7e78364-3d2e-435a-a0fb-d85cb2586006-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516560 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516594 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516643 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khzgf\" (UniqueName: 
\"kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516673 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516705 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516747 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcwkx\" (UniqueName: \"kubernetes.io/projected/b7e78364-3d2e-435a-a0fb-d85cb2586006-kube-api-access-bcwkx\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516785 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516833 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516880 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516917 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516953 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.516998 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 
12:46:46.517039 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.517460 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7e78364-3d2e-435a-a0fb-d85cb2586006-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.519617 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.520169 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.527854 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.531683 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.534840 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.537820 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.537976 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.538640 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 
12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.539042 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.540646 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcwkx\" (UniqueName: \"kubernetes.io/projected/b7e78364-3d2e-435a-a0fb-d85cb2586006-kube-api-access-bcwkx\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.541138 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.541214 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.541488 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvsfl\" (UniqueName: \"kubernetes.io/projected/d2050e43-459e-42d5-ae48-1e8e03dd089f-kube-api-access-vvsfl\") pod \"kube-state-metrics-0\" (UID: \"d2050e43-459e-42d5-ae48-1e8e03dd089f\") " pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.542277 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.542973 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khzgf\" (UniqueName: \"kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf\") pod \"ceilometer-0\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.543076 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.543381 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7e78364-3d2e-435a-a0fb-d85cb2586006-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7e78364-3d2e-435a-a0fb-d85cb2586006\") " pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.679762 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.692067 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.696153 4666 scope.go:117] "RemoveContainer" containerID="4ec8c3c0bd75bd8eb337878dc76b255d385df4e0d6b42d68224bfd786b307590" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.709788 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.737154 4666 scope.go:117] "RemoveContainer" containerID="f6bda6c9e80596ed08358252b619f5ef36bb71435df79dd92adbe4e4abbb4789" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.807114 4666 scope.go:117] "RemoveContainer" containerID="a279afa3c2b9b60a570bc7d6129224f6bf07c1a9935bbf050498966299ae9815" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.863392 4666 scope.go:117] "RemoveContainer" containerID="ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80" Dec 03 12:46:46 crc kubenswrapper[4666]: E1203 12:46:46.972610 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80\": container with ID starting with ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80 not found: ID does not exist" containerID="ecb7f361d74f9bc4e664c6298abeb2c9a9919fc1ed9f0673bf483276a3a51f80" Dec 03 12:46:46 crc kubenswrapper[4666]: I1203 12:46:46.972673 4666 scope.go:117] "RemoveContainer" containerID="ca8ebbbf5482d7074e56ea0dcbb4d06bc5bd74e098ee400d3c2582bb272b8bee" Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.117122 4666 scope.go:117] "RemoveContainer" containerID="51390f023f6a5e2de380856a2fa0f240602ed0a97d64502e92b41bf31a13357d" Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.356024 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 12:46:47 crc kubenswrapper[4666]: W1203 12:46:47.358913 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf428b93f_6490_4285_9d98_37601fe3bd86.slice/crio-15458b044af39df42c1708ebd5265df62e8888a36180ecd13c9691763e6f859d WatchSource:0}: Error finding container 15458b044af39df42c1708ebd5265df62e8888a36180ecd13c9691763e6f859d: Status 404 returned error can't find the container with id 15458b044af39df42c1708ebd5265df62e8888a36180ecd13c9691763e6f859d Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.367511 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.441910 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac" path="/var/lib/kubelet/pods/8cff28e2-a79e-4a5f-8adc-a72a3e03f8ac/volumes" Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.443149 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b052c5d6-2493-4e71-9df6-3b275aafaf2e" path="/var/lib/kubelet/pods/b052c5d6-2493-4e71-9df6-3b275aafaf2e/volumes" Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.444278 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74" path="/var/lib/kubelet/pods/ee64b2c1-ef7e-4b8b-a793-8bca0e2eda74/volumes" Dec 03 12:46:47 crc kubenswrapper[4666]: I1203 12:46:47.719402 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 12:46:47 crc 
kubenswrapper[4666]: W1203 12:46:47.722272 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2050e43_459e_42d5_ae48_1e8e03dd089f.slice/crio-1f33e049c1eeeecea2905272dde759cc639531e403630e919e72291a747dd39d WatchSource:0}: Error finding container 1f33e049c1eeeecea2905272dde759cc639531e403630e919e72291a747dd39d: Status 404 returned error can't find the container with id 1f33e049c1eeeecea2905272dde759cc639531e403630e919e72291a747dd39d Dec 03 12:46:48 crc kubenswrapper[4666]: I1203 12:46:48.050865 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerStarted","Data":"15458b044af39df42c1708ebd5265df62e8888a36180ecd13c9691763e6f859d"} Dec 03 12:46:48 crc kubenswrapper[4666]: I1203 12:46:48.053372 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d2050e43-459e-42d5-ae48-1e8e03dd089f","Type":"ContainerStarted","Data":"1f33e049c1eeeecea2905272dde759cc639531e403630e919e72291a747dd39d"} Dec 03 12:46:48 crc kubenswrapper[4666]: I1203 12:46:48.055982 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7e78364-3d2e-435a-a0fb-d85cb2586006","Type":"ContainerStarted","Data":"c86b3f64476d395ce4219375ef61381d1e1dd7de919f48a9d874490e8237c581"} Dec 03 12:46:49 crc kubenswrapper[4666]: I1203 12:46:49.068025 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7e78364-3d2e-435a-a0fb-d85cb2586006","Type":"ContainerStarted","Data":"696f9521588e1a326e2c191360fa881785f5ce32ac438b04c46569ed66eea7e6"} Dec 03 12:46:50 crc kubenswrapper[4666]: I1203 12:46:50.425566 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:46:52 crc kubenswrapper[4666]: I1203 12:46:52.104996 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078"} Dec 03 12:46:53 crc kubenswrapper[4666]: I1203 12:46:53.113817 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7e78364-3d2e-435a-a0fb-d85cb2586006","Type":"ContainerStarted","Data":"022ffbbfd32e9223d5cb40b842aa581d479f37e73850d76f0572cde2339fdcab"} Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.128556 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerStarted","Data":"e2f68816cce404140490c567e04e46078cbe6103eea95ca2fabeb77ef52abfc8"} Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.131922 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d2050e43-459e-42d5-ae48-1e8e03dd089f","Type":"ContainerStarted","Data":"81a3fca746c7db0ce667bb8eff615641e1ebe89d09080725262c08344aa945a8"} Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.132374 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.136807 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"a7c6b242-ba03-4e43-9061-e908c5af1c78","Type":"ContainerStarted","Data":"ec8566b4a66b0ba5d3eac2f2127e026a87b8ac1f3cc5a8828d0e416f2a65b7b0"} Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.156469 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.570468613 podStartE2EDuration="8.156444695s" podCreationTimestamp="2025-12-03 12:46:46 +0000 UTC" firstStartedPulling="2025-12-03 12:46:47.726127026 +0000 UTC m=+1996.571088087" lastFinishedPulling="2025-12-03 12:46:53.312103128 +0000 UTC m=+2002.157064169" observedRunningTime="2025-12-03 12:46:54.153839435 +0000 UTC m=+2002.998800506" watchObservedRunningTime="2025-12-03 12:46:54.156444695 +0000 UTC m=+2003.001405786" Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.165020 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.164993256 podStartE2EDuration="8.164993256s" podCreationTimestamp="2025-12-03 12:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:46:53.170103972 +0000 UTC m=+2002.015065023" watchObservedRunningTime="2025-12-03 12:46:54.164993256 +0000 UTC m=+2003.009954337" Dec 03 12:46:54 crc kubenswrapper[4666]: I1203 12:46:54.175582 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=11.238662394 podStartE2EDuration="23.175567361s" podCreationTimestamp="2025-12-03 12:46:31 +0000 UTC" firstStartedPulling="2025-12-03 12:46:34.833679017 +0000 UTC m=+1983.678640068" lastFinishedPulling="2025-12-03 12:46:46.770583984 +0000 UTC m=+1995.615545035" observedRunningTime="2025-12-03 12:46:54.173549237 +0000 UTC m=+2003.018510358" watchObservedRunningTime="2025-12-03 12:46:54.175567361 +0000 UTC m=+2003.020528402" Dec 03 12:46:56 crc kubenswrapper[4666]: I1203 12:46:56.680579 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.354564 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.356251 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.369856 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.371064 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmpn\" (UniqueName: \"kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.371528 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.371631 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.472853 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmpn\" (UniqueName: \"kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.472944 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.473035 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.473794 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.473802 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.501965 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sfmpn\" (UniqueName: \"kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn\") pod \"redhat-marketplace-l8b8q\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.662424 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 12:46:58 crc kubenswrapper[4666]: I1203 12:46:58.687050 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:46:59 crc kubenswrapper[4666]: W1203 12:46:59.438304 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e18c2f_10f5_4c7b_90d6_b4232e1eee13.slice/crio-c65c2d51366ef7cc996ea24bb860ae8c12d64210e966ec135bfbf6cf918aa88f WatchSource:0}: Error finding container c65c2d51366ef7cc996ea24bb860ae8c12d64210e966ec135bfbf6cf918aa88f: Status 404 returned error can't find the container with id c65c2d51366ef7cc996ea24bb860ae8c12d64210e966ec135bfbf6cf918aa88f Dec 03 12:46:59 crc kubenswrapper[4666]: I1203 12:46:59.439168 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:47:00 crc kubenswrapper[4666]: I1203 12:47:00.192377 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerStarted","Data":"24637595225f333314300e7f866c03d7b1d865da514d7dd5be9a19c614cd1c9f"} Dec 03 12:47:00 crc kubenswrapper[4666]: I1203 12:47:00.194663 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerStarted","Data":"c65c2d51366ef7cc996ea24bb860ae8c12d64210e966ec135bfbf6cf918aa88f"} Dec 03 12:47:01 crc kubenswrapper[4666]: I1203 12:47:01.210001 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerStarted","Data":"d19a4faaa24af983517304a98f56a8addb84d83e79a8a4a46b6ba596180e23b5"} Dec 03 12:47:01 crc kubenswrapper[4666]: I1203 12:47:01.212039 4666 generic.go:334] "Generic (PLEG): container finished" podID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerID="5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866" exitCode=0 Dec 03 12:47:01 crc kubenswrapper[4666]: I1203 12:47:01.212080 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerDied","Data":"5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866"} Dec 03 12:47:01 crc kubenswrapper[4666]: I1203 12:47:01.692577 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.194216 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-26qkc"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.196166 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.207665 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-26qkc"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233313 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerStarted","Data":"0bacdd97ee7b4a75826c538e856cd90edcc50b5fc0b0c7715f0e4a541bed09ca"} Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233466 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-central-agent" containerID="cri-o://e2f68816cce404140490c567e04e46078cbe6103eea95ca2fabeb77ef52abfc8" gracePeriod=30 Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233627 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233634 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="sg-core" containerID="cri-o://d19a4faaa24af983517304a98f56a8addb84d83e79a8a4a46b6ba596180e23b5" gracePeriod=30 Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233690 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-notification-agent" containerID="cri-o://24637595225f333314300e7f866c03d7b1d865da514d7dd5be9a19c614cd1c9f" gracePeriod=30 Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.233766 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="proxy-httpd" containerID="cri-o://0bacdd97ee7b4a75826c538e856cd90edcc50b5fc0b0c7715f0e4a541bed09ca" gracePeriod=30 Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.245522 4666 generic.go:334] "Generic (PLEG): container finished" podID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerID="80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8" exitCode=0 Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.245562 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerDied","Data":"80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8"} Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.264174 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.15907168 podStartE2EDuration="17.264057715s" podCreationTimestamp="2025-12-03 12:46:46 +0000 UTC" firstStartedPulling="2025-12-03 12:46:47.362245136 +0000 UTC m=+1996.207206207" lastFinishedPulling="2025-12-03 12:47:02.467231191 +0000 UTC m=+2011.312192242" observedRunningTime="2025-12-03 12:47:03.258645209 +0000 UTC m=+2012.103606280" watchObservedRunningTime="2025-12-03 12:47:03.264057715 +0000 UTC m=+2012.109018766" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.311329 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4f48w"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.312415 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.319789 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4f48w"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.362049 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.362113 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tcwf\" (UniqueName: \"kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.404484 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7596k"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.405828 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.422390 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7596k"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.450906 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b618-account-create-update-2c7dc"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.452104 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.459467 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.459974 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b618-account-create-update-2c7dc"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.465547 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.465724 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tcwf\" (UniqueName: \"kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.466081 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5xf\" (UniqueName: \"kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.466226 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.466305 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.496773 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tcwf\" (UniqueName: \"kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf\") pod \"nova-api-db-create-26qkc\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.514892 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.567722 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.567835 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgpp\" (UniqueName: \"kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.567887 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.567927 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5xf\" (UniqueName: \"kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.567984 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.568023 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdxwb\" (UniqueName: \"kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.568879 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.595515 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5xf\" (UniqueName: \"kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf\") pod \"nova-cell0-db-create-4f48w\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.613816 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3a3c-account-create-update-4qd86"] Dec 03 12:47:03 crc kubenswrapper[4666]: 
I1203 12:47:03.615267 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.618359 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.628426 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3a3c-account-create-update-4qd86"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.637635 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.670313 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.670907 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdxwb\" (UniqueName: \"kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.671045 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.671231 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgpp\" (UniqueName: \"kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.672820 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.686048 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.696557 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgpp\" (UniqueName: \"kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp\") pod \"nova-cell1-db-create-7596k\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 
12:47:03.697741 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdxwb\" (UniqueName: \"kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb\") pod \"nova-api-b618-account-create-update-2c7dc\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.735825 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.776013 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.776062 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2h2r\" (UniqueName: \"kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.830215 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f103-account-create-update-hscdk"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.831354 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.833640 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.866749 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f103-account-create-update-hscdk"] Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.878314 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.878369 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2h2r\" (UniqueName: \"kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.879373 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.929627 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t2h2r\" (UniqueName: \"kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r\") pod \"nova-cell0-3a3c-account-create-update-4qd86\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.938971 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.970930 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.997158 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:03 crc kubenswrapper[4666]: I1203 12:47:03.997290 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrqfj\" (UniqueName: \"kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.013939 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-26qkc"] Dec 03 12:47:04 crc kubenswrapper[4666]: W1203 12:47:04.043064 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod502dbfff_6c76_448e_aff9_db535351f22f.slice/crio-b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34 WatchSource:0}: Error finding container b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34: Status 404 returned error can't find the container with id b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34 Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.104491 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.104953 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrqfj\" (UniqueName: \"kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.105242 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " 
pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.123781 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrqfj\" (UniqueName: \"kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj\") pod \"nova-cell1-f103-account-create-update-hscdk\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.150003 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.179311 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4f48w"] Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.266753 4666 generic.go:334] "Generic (PLEG): container finished" podID="f428b93f-6490-4285-9d98-37601fe3bd86" containerID="0bacdd97ee7b4a75826c538e856cd90edcc50b5fc0b0c7715f0e4a541bed09ca" exitCode=0 Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.266789 4666 generic.go:334] "Generic (PLEG): container finished" podID="f428b93f-6490-4285-9d98-37601fe3bd86" containerID="d19a4faaa24af983517304a98f56a8addb84d83e79a8a4a46b6ba596180e23b5" exitCode=2 Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.266838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerDied","Data":"0bacdd97ee7b4a75826c538e856cd90edcc50b5fc0b0c7715f0e4a541bed09ca"} Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.266866 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerDied","Data":"d19a4faaa24af983517304a98f56a8addb84d83e79a8a4a46b6ba596180e23b5"} Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.268789 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4f48w" event={"ID":"54662f34-5ced-49d9-bfba-ddccae72099e","Type":"ContainerStarted","Data":"24e845e2a4be8a494e52eace559cdcb8941de7a851d1757b1f0ee48ac7b10229"} Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.270218 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-26qkc" event={"ID":"502dbfff-6c76-448e-aff9-db535351f22f","Type":"ContainerStarted","Data":"b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34"} Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.374720 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7596k"] Dec 03 12:47:04 crc kubenswrapper[4666]: W1203 12:47:04.395291 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf374154c_83df_4685_8224_aa067097648d.slice/crio-6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490 WatchSource:0}: Error finding container 6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490: Status 404 returned error can't find the container with id 6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490 Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.463950 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b618-account-create-update-2c7dc"] Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.529708 4666 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-3a3c-account-create-update-4qd86"] Dec 03 12:47:04 crc kubenswrapper[4666]: I1203 12:47:04.780100 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f103-account-create-update-hscdk"] Dec 03 12:47:04 crc kubenswrapper[4666]: W1203 12:47:04.782565 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd206ccec_3195_42c1_8c07_b27785183fd7.slice/crio-1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0 WatchSource:0}: Error finding container 1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0: Status 404 returned error can't find the container with id 1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.293388 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerStarted","Data":"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.298732 4666 generic.go:334] "Generic (PLEG): container finished" podID="f374154c-83df-4685-8224-aa067097648d" containerID="39f40f2c6efbd7a882478c5e9843c1e6358a05338bfbb88bae7015f53aac9aa9" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.298818 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7596k" event={"ID":"f374154c-83df-4685-8224-aa067097648d","Type":"ContainerDied","Data":"39f40f2c6efbd7a882478c5e9843c1e6358a05338bfbb88bae7015f53aac9aa9"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.298840 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7596k" event={"ID":"f374154c-83df-4685-8224-aa067097648d","Type":"ContainerStarted","Data":"6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.300263 4666 generic.go:334] "Generic (PLEG): container finished" podID="baaeae1d-3d78-477a-a46d-80ee1c6447b1" containerID="09db20aae634ddf485f19ff2fe5abc7e0b1ee634a6e4dabc74af3c08dec48e66" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.300301 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" event={"ID":"baaeae1d-3d78-477a-a46d-80ee1c6447b1","Type":"ContainerDied","Data":"09db20aae634ddf485f19ff2fe5abc7e0b1ee634a6e4dabc74af3c08dec48e66"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.300317 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" event={"ID":"baaeae1d-3d78-477a-a46d-80ee1c6447b1","Type":"ContainerStarted","Data":"af8f55801e5cbab6413dbec915e1bb7724ed133259cb8041653db1cb926a73ac"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.306419 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f103-account-create-update-hscdk" event={"ID":"d206ccec-3195-42c1-8c07-b27785183fd7","Type":"ContainerStarted","Data":"26efb468e7ccf5b0be81244d4e786ca27e41fda98463bb1e29cb129b83cb7464"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.306469 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f103-account-create-update-hscdk" 
event={"ID":"d206ccec-3195-42c1-8c07-b27785183fd7","Type":"ContainerStarted","Data":"1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.323536 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8b8q" podStartSLOduration=3.924386237 podStartE2EDuration="7.323513496s" podCreationTimestamp="2025-12-03 12:46:58 +0000 UTC" firstStartedPulling="2025-12-03 12:47:01.213451503 +0000 UTC m=+2010.058412554" lastFinishedPulling="2025-12-03 12:47:04.612578762 +0000 UTC m=+2013.457539813" observedRunningTime="2025-12-03 12:47:05.313008853 +0000 UTC m=+2014.157969934" watchObservedRunningTime="2025-12-03 12:47:05.323513496 +0000 UTC m=+2014.168474567" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.327656 4666 generic.go:334] "Generic (PLEG): container finished" podID="f428b93f-6490-4285-9d98-37601fe3bd86" containerID="24637595225f333314300e7f866c03d7b1d865da514d7dd5be9a19c614cd1c9f" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.327692 4666 generic.go:334] "Generic (PLEG): container finished" podID="f428b93f-6490-4285-9d98-37601fe3bd86" containerID="e2f68816cce404140490c567e04e46078cbe6103eea95ca2fabeb77ef52abfc8" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.327776 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerDied","Data":"24637595225f333314300e7f866c03d7b1d865da514d7dd5be9a19c614cd1c9f"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.327806 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerDied","Data":"e2f68816cce404140490c567e04e46078cbe6103eea95ca2fabeb77ef52abfc8"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.343212 4666 generic.go:334] "Generic (PLEG): container finished" podID="54662f34-5ced-49d9-bfba-ddccae72099e" containerID="97e1fca69628ee42cca3434b8c8b2ba5795291e2c919a0bcf6e9cc1f1f398d4d" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.343273 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4f48w" event={"ID":"54662f34-5ced-49d9-bfba-ddccae72099e","Type":"ContainerDied","Data":"97e1fca69628ee42cca3434b8c8b2ba5795291e2c919a0bcf6e9cc1f1f398d4d"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.345433 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.353591 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-f103-account-create-update-hscdk" podStartSLOduration=2.353571038 podStartE2EDuration="2.353571038s" podCreationTimestamp="2025-12-03 12:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:05.331443621 +0000 UTC m=+2014.176404672" watchObservedRunningTime="2025-12-03 12:47:05.353571038 +0000 UTC m=+2014.198532109" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.354716 4666 generic.go:334] "Generic (PLEG): container finished" podID="502dbfff-6c76-448e-aff9-db535351f22f" containerID="c7b8e36a55f83333562a64d217284ffeda7eaa01718c5f49529c102007ce49a5" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.354837 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-26qkc" event={"ID":"502dbfff-6c76-448e-aff9-db535351f22f","Type":"ContainerDied","Data":"c7b8e36a55f83333562a64d217284ffeda7eaa01718c5f49529c102007ce49a5"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.363601 4666 generic.go:334] "Generic (PLEG): container finished" podID="b98004b4-42cb-4eca-9da3-c440aa955f18" containerID="a2073092072ec6c89786ae40717353619657ea8ccd9a68f703b28c6092d09428" exitCode=0 Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.363781 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b618-account-create-update-2c7dc" event={"ID":"b98004b4-42cb-4eca-9da3-c440aa955f18","Type":"ContainerDied","Data":"a2073092072ec6c89786ae40717353619657ea8ccd9a68f703b28c6092d09428"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.363882 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b618-account-create-update-2c7dc" event={"ID":"b98004b4-42cb-4eca-9da3-c440aa955f18","Type":"ContainerStarted","Data":"38b8fd62030824b6a7bc663093b0f8025e33f15dfe2c2581d2a68eeea5bd9b99"} Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534220 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khzgf\" (UniqueName: \"kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534637 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534701 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534722 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc 
kubenswrapper[4666]: I1203 12:47:05.534746 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534772 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534792 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.534848 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd\") pod \"f428b93f-6490-4285-9d98-37601fe3bd86\" (UID: \"f428b93f-6490-4285-9d98-37601fe3bd86\") " Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.535851 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.535934 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.542891 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf" (OuterVolumeSpecName: "kube-api-access-khzgf") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "kube-api-access-khzgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.554210 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts" (OuterVolumeSpecName: "scripts") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.567208 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.604593 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.615530 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637209 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637238 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637247 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637255 4666 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637265 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f428b93f-6490-4285-9d98-37601fe3bd86-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637274 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khzgf\" (UniqueName: \"kubernetes.io/projected/f428b93f-6490-4285-9d98-37601fe3bd86-kube-api-access-khzgf\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.637281 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.647457 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data" (OuterVolumeSpecName: "config-data") pod "f428b93f-6490-4285-9d98-37601fe3bd86" (UID: "f428b93f-6490-4285-9d98-37601fe3bd86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:05 crc kubenswrapper[4666]: I1203 12:47:05.738856 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f428b93f-6490-4285-9d98-37601fe3bd86-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.374079 4666 generic.go:334] "Generic (PLEG): container finished" podID="d206ccec-3195-42c1-8c07-b27785183fd7" containerID="26efb468e7ccf5b0be81244d4e786ca27e41fda98463bb1e29cb129b83cb7464" exitCode=0 Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.374172 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f103-account-create-update-hscdk" event={"ID":"d206ccec-3195-42c1-8c07-b27785183fd7","Type":"ContainerDied","Data":"26efb468e7ccf5b0be81244d4e786ca27e41fda98463bb1e29cb129b83cb7464"} Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.378194 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.378460 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f428b93f-6490-4285-9d98-37601fe3bd86","Type":"ContainerDied","Data":"15458b044af39df42c1708ebd5265df62e8888a36180ecd13c9691763e6f859d"} Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.378540 4666 scope.go:117] "RemoveContainer" containerID="0bacdd97ee7b4a75826c538e856cd90edcc50b5fc0b0c7715f0e4a541bed09ca" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.430206 4666 scope.go:117] "RemoveContainer" containerID="d19a4faaa24af983517304a98f56a8addb84d83e79a8a4a46b6ba596180e23b5" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.440775 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.468143 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.479572 4666 scope.go:117] "RemoveContainer" containerID="24637595225f333314300e7f866c03d7b1d865da514d7dd5be9a19c614cd1c9f" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.556374 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:06 crc kubenswrapper[4666]: E1203 12:47:06.571822 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="proxy-httpd" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.571870 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="proxy-httpd" Dec 03 12:47:06 crc kubenswrapper[4666]: E1203 12:47:06.571892 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="sg-core" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.571901 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="sg-core" Dec 03 12:47:06 crc kubenswrapper[4666]: E1203 12:47:06.571933 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-central-agent" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.571943 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-central-agent" Dec 03 12:47:06 crc 
kubenswrapper[4666]: E1203 12:47:06.571960 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-notification-agent" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.571969 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-notification-agent" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.572506 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="sg-core" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.572538 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="proxy-httpd" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.572554 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-central-agent" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.572578 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" containerName="ceilometer-notification-agent" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.578222 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.578521 4666 scope.go:117] "RemoveContainer" containerID="e2f68816cce404140490c567e04e46078cbe6103eea95ca2fabeb77ef52abfc8" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.583703 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.583863 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.586775 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.587551 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660701 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660742 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660802 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660834 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660852 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9l8\" (UniqueName: \"kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.660882 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.661042 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.661059 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.710643 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762723 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762761 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762847 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762864 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762885 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762903 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762918 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9l8\" (UniqueName: \"kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.762936 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.765437 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.769410 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.769587 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.769758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.770328 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.770327 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.770619 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 
03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.786849 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9l8\" (UniqueName: \"kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8\") pod \"ceilometer-0\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.875767 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.917154 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.968416 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgpp\" (UniqueName: \"kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp\") pod \"f374154c-83df-4685-8224-aa067097648d\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.968857 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts\") pod \"f374154c-83df-4685-8224-aa067097648d\" (UID: \"f374154c-83df-4685-8224-aa067097648d\") " Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.969822 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f374154c-83df-4685-8224-aa067097648d" (UID: "f374154c-83df-4685-8224-aa067097648d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:06 crc kubenswrapper[4666]: I1203 12:47:06.972593 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp" (OuterVolumeSpecName: "kube-api-access-xkgpp") pod "f374154c-83df-4685-8224-aa067097648d" (UID: "f374154c-83df-4685-8224-aa067097648d"). InnerVolumeSpecName "kube-api-access-xkgpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.064862 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.070357 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgpp\" (UniqueName: \"kubernetes.io/projected/f374154c-83df-4685-8224-aa067097648d-kube-api-access-xkgpp\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.070378 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f374154c-83df-4685-8224-aa067097648d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.074143 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.085377 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.093518 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172274 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tcwf\" (UniqueName: \"kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf\") pod \"502dbfff-6c76-448e-aff9-db535351f22f\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172410 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts\") pod \"54662f34-5ced-49d9-bfba-ddccae72099e\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172443 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts\") pod \"b98004b4-42cb-4eca-9da3-c440aa955f18\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172521 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5xf\" (UniqueName: \"kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf\") pod \"54662f34-5ced-49d9-bfba-ddccae72099e\" (UID: \"54662f34-5ced-49d9-bfba-ddccae72099e\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172591 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts\") pod \"502dbfff-6c76-448e-aff9-db535351f22f\" (UID: \"502dbfff-6c76-448e-aff9-db535351f22f\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172693 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2h2r\" (UniqueName: \"kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r\") pod \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172766 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdxwb\" (UniqueName: \"kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb\") pod \"b98004b4-42cb-4eca-9da3-c440aa955f18\" (UID: \"b98004b4-42cb-4eca-9da3-c440aa955f18\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.172824 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts\") pod \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\" (UID: \"baaeae1d-3d78-477a-a46d-80ee1c6447b1\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.174312 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "baaeae1d-3d78-477a-a46d-80ee1c6447b1" (UID: "baaeae1d-3d78-477a-a46d-80ee1c6447b1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.174771 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "502dbfff-6c76-448e-aff9-db535351f22f" (UID: "502dbfff-6c76-448e-aff9-db535351f22f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.177015 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b98004b4-42cb-4eca-9da3-c440aa955f18" (UID: "b98004b4-42cb-4eca-9da3-c440aa955f18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.177761 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54662f34-5ced-49d9-bfba-ddccae72099e" (UID: "54662f34-5ced-49d9-bfba-ddccae72099e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.177863 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf" (OuterVolumeSpecName: "kube-api-access-9tcwf") pod "502dbfff-6c76-448e-aff9-db535351f22f" (UID: "502dbfff-6c76-448e-aff9-db535351f22f"). InnerVolumeSpecName "kube-api-access-9tcwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.178718 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf" (OuterVolumeSpecName: "kube-api-access-gx5xf") pod "54662f34-5ced-49d9-bfba-ddccae72099e" (UID: "54662f34-5ced-49d9-bfba-ddccae72099e"). InnerVolumeSpecName "kube-api-access-gx5xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.179471 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r" (OuterVolumeSpecName: "kube-api-access-t2h2r") pod "baaeae1d-3d78-477a-a46d-80ee1c6447b1" (UID: "baaeae1d-3d78-477a-a46d-80ee1c6447b1"). InnerVolumeSpecName "kube-api-access-t2h2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.179884 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb" (OuterVolumeSpecName: "kube-api-access-pdxwb") pod "b98004b4-42cb-4eca-9da3-c440aa955f18" (UID: "b98004b4-42cb-4eca-9da3-c440aa955f18"). InnerVolumeSpecName "kube-api-access-pdxwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275413 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tcwf\" (UniqueName: \"kubernetes.io/projected/502dbfff-6c76-448e-aff9-db535351f22f-kube-api-access-9tcwf\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275529 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54662f34-5ced-49d9-bfba-ddccae72099e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275539 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98004b4-42cb-4eca-9da3-c440aa955f18-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275548 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5xf\" (UniqueName: \"kubernetes.io/projected/54662f34-5ced-49d9-bfba-ddccae72099e-kube-api-access-gx5xf\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275559 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/502dbfff-6c76-448e-aff9-db535351f22f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275570 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2h2r\" (UniqueName: \"kubernetes.io/projected/baaeae1d-3d78-477a-a46d-80ee1c6447b1-kube-api-access-t2h2r\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275580 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdxwb\" (UniqueName: \"kubernetes.io/projected/b98004b4-42cb-4eca-9da3-c440aa955f18-kube-api-access-pdxwb\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.275590 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baaeae1d-3d78-477a-a46d-80ee1c6447b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.387838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b618-account-create-update-2c7dc" event={"ID":"b98004b4-42cb-4eca-9da3-c440aa955f18","Type":"ContainerDied","Data":"38b8fd62030824b6a7bc663093b0f8025e33f15dfe2c2581d2a68eeea5bd9b99"} Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.387880 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b8fd62030824b6a7bc663093b0f8025e33f15dfe2c2581d2a68eeea5bd9b99" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.387956 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b618-account-create-update-2c7dc" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.402355 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7596k" event={"ID":"f374154c-83df-4685-8224-aa067097648d","Type":"ContainerDied","Data":"6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490"} Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.402925 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d7efb7044671500ff9eee8cfc39e9eb723c65b151b36421a68993b8307a1490" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.402781 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7596k" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.407108 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" event={"ID":"baaeae1d-3d78-477a-a46d-80ee1c6447b1","Type":"ContainerDied","Data":"af8f55801e5cbab6413dbec915e1bb7724ed133259cb8041653db1cb926a73ac"} Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.407153 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af8f55801e5cbab6413dbec915e1bb7724ed133259cb8041653db1cb926a73ac" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.407198 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3a3c-account-create-update-4qd86" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.414389 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4f48w" event={"ID":"54662f34-5ced-49d9-bfba-ddccae72099e","Type":"ContainerDied","Data":"24e845e2a4be8a494e52eace559cdcb8941de7a851d1757b1f0ee48ac7b10229"} Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.414437 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e845e2a4be8a494e52eace559cdcb8941de7a851d1757b1f0ee48ac7b10229" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.414506 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4f48w" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.417925 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.424822 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-26qkc" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.458938 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f428b93f-6490-4285-9d98-37601fe3bd86" path="/var/lib/kubelet/pods/f428b93f-6490-4285-9d98-37601fe3bd86/volumes" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.461958 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-26qkc" event={"ID":"502dbfff-6c76-448e-aff9-db535351f22f","Type":"ContainerDied","Data":"b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34"} Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.462028 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f87ebfb436bf77b63c16e27e61f90621bdc2223c025f5d2050f3a408902a34" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.659400 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.784361 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrqfj\" (UniqueName: \"kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj\") pod \"d206ccec-3195-42c1-8c07-b27785183fd7\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.784541 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts\") pod \"d206ccec-3195-42c1-8c07-b27785183fd7\" (UID: \"d206ccec-3195-42c1-8c07-b27785183fd7\") " Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.789987 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj" (OuterVolumeSpecName: "kube-api-access-hrqfj") pod "d206ccec-3195-42c1-8c07-b27785183fd7" (UID: "d206ccec-3195-42c1-8c07-b27785183fd7"). InnerVolumeSpecName "kube-api-access-hrqfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.873662 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d206ccec-3195-42c1-8c07-b27785183fd7" (UID: "d206ccec-3195-42c1-8c07-b27785183fd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.886629 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrqfj\" (UniqueName: \"kubernetes.io/projected/d206ccec-3195-42c1-8c07-b27785183fd7-kube-api-access-hrqfj\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:07 crc kubenswrapper[4666]: I1203 12:47:07.886667 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206ccec-3195-42c1-8c07-b27785183fd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.432874 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerStarted","Data":"fbca30b5b6a7787e34af08460490d8dce497a0ec2fa772fcf7e1c1502a0671b7"} Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.433519 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerStarted","Data":"82ac8166ed1609984d4a5958d37edc70d8f963c7848c23434cc55701e5492d02"} Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.434364 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f103-account-create-update-hscdk" event={"ID":"d206ccec-3195-42c1-8c07-b27785183fd7","Type":"ContainerDied","Data":"1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0"} Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.434388 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1605e4c6e22f3ff61f31705f43b1f0c486d43c2f19f4c6935af4de365aa0baa0" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.434429 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f103-account-create-update-hscdk" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.687313 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.687361 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.737535 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.860587 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qr2ph"] Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861720 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d206ccec-3195-42c1-8c07-b27785183fd7" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861742 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d206ccec-3195-42c1-8c07-b27785183fd7" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861782 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54662f34-5ced-49d9-bfba-ddccae72099e" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861790 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="54662f34-5ced-49d9-bfba-ddccae72099e" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861815 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f374154c-83df-4685-8224-aa067097648d" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861825 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f374154c-83df-4685-8224-aa067097648d" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861852 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baaeae1d-3d78-477a-a46d-80ee1c6447b1" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861862 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="baaeae1d-3d78-477a-a46d-80ee1c6447b1" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861886 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98004b4-42cb-4eca-9da3-c440aa955f18" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861895 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98004b4-42cb-4eca-9da3-c440aa955f18" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: E1203 12:47:08.861917 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502dbfff-6c76-448e-aff9-db535351f22f" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.861926 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="502dbfff-6c76-448e-aff9-db535351f22f" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862427 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="502dbfff-6c76-448e-aff9-db535351f22f" containerName="mariadb-database-create" Dec 03 
12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862453 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f374154c-83df-4685-8224-aa067097648d" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862484 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="baaeae1d-3d78-477a-a46d-80ee1c6447b1" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862494 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98004b4-42cb-4eca-9da3-c440aa955f18" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862508 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="54662f34-5ced-49d9-bfba-ddccae72099e" containerName="mariadb-database-create" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.862536 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d206ccec-3195-42c1-8c07-b27785183fd7" containerName="mariadb-account-create-update" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.863581 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.867215 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.867527 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.867832 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-56296" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.889749 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qr2ph"] Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.901328 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.901417 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9mf6\" (UniqueName: \"kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.901479 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:08 crc kubenswrapper[4666]: I1203 12:47:08.901570 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: 
\"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.002921 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.003288 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.003343 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9mf6\" (UniqueName: \"kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.003385 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.007958 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.008502 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.011040 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.023760 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9mf6\" (UniqueName: \"kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6\") pod \"nova-cell0-conductor-db-sync-qr2ph\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") " pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.183402 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.506641 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.555845 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:47:09 crc kubenswrapper[4666]: I1203 12:47:09.616547 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qr2ph"] Dec 03 12:47:09 crc kubenswrapper[4666]: W1203 12:47:09.619379 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7912a6f_d432_4e4c_8c31_0023deac5557.slice/crio-a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20 WatchSource:0}: Error finding container a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20: Status 404 returned error can't find the container with id a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20 Dec 03 12:47:10 crc kubenswrapper[4666]: I1203 12:47:10.463958 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerStarted","Data":"d5060b377f5b92b7e1907af2ba0f651876fa4b5abc23d97d7b37e1da54c8bb59"} Dec 03 12:47:10 crc kubenswrapper[4666]: I1203 12:47:10.464348 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerStarted","Data":"4a32b3eef5e5cf302cf1af34cf45ce2192a5dbf5978055a52714a02fc8664055"} Dec 03 12:47:10 crc kubenswrapper[4666]: I1203 12:47:10.465312 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" event={"ID":"e7912a6f-d432-4e4c-8c31-0023deac5557","Type":"ContainerStarted","Data":"a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20"} Dec 03 12:47:11 crc kubenswrapper[4666]: I1203 12:47:11.479648 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8b8q" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="registry-server" containerID="cri-o://0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf" gracePeriod=2 Dec 03 12:47:11 crc kubenswrapper[4666]: E1203 12:47:11.656588 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e18c2f_10f5_4c7b_90d6_b4232e1eee13.slice/crio-conmon-0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e18c2f_10f5_4c7b_90d6_b4232e1eee13.slice/crio-0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:47:11 crc kubenswrapper[4666]: I1203 12:47:11.927970 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.074273 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfmpn\" (UniqueName: \"kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn\") pod \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.074501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities\") pod \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.075217 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content\") pod \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\" (UID: \"12e18c2f-10f5-4c7b-90d6-b4232e1eee13\") " Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.075835 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities" (OuterVolumeSpecName: "utilities") pod "12e18c2f-10f5-4c7b-90d6-b4232e1eee13" (UID: "12e18c2f-10f5-4c7b-90d6-b4232e1eee13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.076007 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.079005 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn" (OuterVolumeSpecName: "kube-api-access-sfmpn") pod "12e18c2f-10f5-4c7b-90d6-b4232e1eee13" (UID: "12e18c2f-10f5-4c7b-90d6-b4232e1eee13"). InnerVolumeSpecName "kube-api-access-sfmpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.091881 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12e18c2f-10f5-4c7b-90d6-b4232e1eee13" (UID: "12e18c2f-10f5-4c7b-90d6-b4232e1eee13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.178034 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.178072 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfmpn\" (UniqueName: \"kubernetes.io/projected/12e18c2f-10f5-4c7b-90d6-b4232e1eee13-kube-api-access-sfmpn\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.492860 4666 generic.go:334] "Generic (PLEG): container finished" podID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerID="0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf" exitCode=0 Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.492974 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8b8q" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.492955 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerDied","Data":"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf"} Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.493004 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8b8q" event={"ID":"12e18c2f-10f5-4c7b-90d6-b4232e1eee13","Type":"ContainerDied","Data":"c65c2d51366ef7cc996ea24bb860ae8c12d64210e966ec135bfbf6cf918aa88f"} Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.493021 4666 scope.go:117] "RemoveContainer" containerID="0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.501294 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerStarted","Data":"54767a9d063d064e937cc597b6f105aba385650041bc1c60c0112c5b0ee1906c"} Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.501583 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.533675 4666 scope.go:117] "RemoveContainer" containerID="80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.540327 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.408199791 podStartE2EDuration="6.540305641s" podCreationTimestamp="2025-12-03 12:47:06 +0000 UTC" firstStartedPulling="2025-12-03 12:47:07.445381203 +0000 UTC m=+2016.290342244" lastFinishedPulling="2025-12-03 12:47:11.577487043 +0000 UTC m=+2020.422448094" observedRunningTime="2025-12-03 12:47:12.522230883 +0000 UTC m=+2021.367191964" watchObservedRunningTime="2025-12-03 12:47:12.540305641 +0000 UTC m=+2021.385266702" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.554996 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.564939 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8b8q"] Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.574955 4666 
scope.go:117] "RemoveContainer" containerID="5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.601325 4666 scope.go:117] "RemoveContainer" containerID="0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf" Dec 03 12:47:12 crc kubenswrapper[4666]: E1203 12:47:12.602013 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf\": container with ID starting with 0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf not found: ID does not exist" containerID="0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.602172 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf"} err="failed to get container status \"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf\": rpc error: code = NotFound desc = could not find container \"0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf\": container with ID starting with 0c93a2da92ed2e1e33a67eefc030d6e5d3ef0b92180f061ff05b83139f91ebbf not found: ID does not exist" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.602291 4666 scope.go:117] "RemoveContainer" containerID="80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8" Dec 03 12:47:12 crc kubenswrapper[4666]: E1203 12:47:12.602957 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8\": container with ID starting with 80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8 not found: ID does not exist" containerID="80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.602997 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8"} err="failed to get container status \"80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8\": rpc error: code = NotFound desc = could not find container \"80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8\": container with ID starting with 80763a8912bcb25ea02631f8f3a5887fb2bc7ec556bb0296dca2168b8e0e97e8 not found: ID does not exist" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.603022 4666 scope.go:117] "RemoveContainer" containerID="5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866" Dec 03 12:47:12 crc kubenswrapper[4666]: E1203 12:47:12.603428 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866\": container with ID starting with 5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866 not found: ID does not exist" containerID="5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866" Dec 03 12:47:12 crc kubenswrapper[4666]: I1203 12:47:12.603450 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866"} err="failed to get container status 
\"5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866\": rpc error: code = NotFound desc = could not find container \"5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866\": container with ID starting with 5515ae34d57f84dba29bbbb62946bb1824b18faca130d61f9f64b5857afd3866 not found: ID does not exist" Dec 03 12:47:13 crc kubenswrapper[4666]: I1203 12:47:13.436595 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" path="/var/lib/kubelet/pods/12e18c2f-10f5-4c7b-90d6-b4232e1eee13/volumes" Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.124765 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.125483 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-central-agent" containerID="cri-o://fbca30b5b6a7787e34af08460490d8dce497a0ec2fa772fcf7e1c1502a0671b7" gracePeriod=30 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.125507 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="sg-core" containerID="cri-o://d5060b377f5b92b7e1907af2ba0f651876fa4b5abc23d97d7b37e1da54c8bb59" gracePeriod=30 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.125560 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="proxy-httpd" containerID="cri-o://54767a9d063d064e937cc597b6f105aba385650041bc1c60c0112c5b0ee1906c" gracePeriod=30 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.125530 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-notification-agent" containerID="cri-o://4a32b3eef5e5cf302cf1af34cf45ce2192a5dbf5978055a52714a02fc8664055" gracePeriod=30 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.570828 4666 generic.go:334] "Generic (PLEG): container finished" podID="02394439-9892-4280-8b01-7978c9fbcc92" containerID="54767a9d063d064e937cc597b6f105aba385650041bc1c60c0112c5b0ee1906c" exitCode=0 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571173 4666 generic.go:334] "Generic (PLEG): container finished" podID="02394439-9892-4280-8b01-7978c9fbcc92" containerID="d5060b377f5b92b7e1907af2ba0f651876fa4b5abc23d97d7b37e1da54c8bb59" exitCode=2 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571182 4666 generic.go:334] "Generic (PLEG): container finished" podID="02394439-9892-4280-8b01-7978c9fbcc92" containerID="4a32b3eef5e5cf302cf1af34cf45ce2192a5dbf5978055a52714a02fc8664055" exitCode=0 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571188 4666 generic.go:334] "Generic (PLEG): container finished" podID="02394439-9892-4280-8b01-7978c9fbcc92" containerID="fbca30b5b6a7787e34af08460490d8dce497a0ec2fa772fcf7e1c1502a0671b7" exitCode=0 Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571208 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerDied","Data":"54767a9d063d064e937cc597b6f105aba385650041bc1c60c0112c5b0ee1906c"} Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571249 4666 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerDied","Data":"d5060b377f5b92b7e1907af2ba0f651876fa4b5abc23d97d7b37e1da54c8bb59"} Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571259 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerDied","Data":"4a32b3eef5e5cf302cf1af34cf45ce2192a5dbf5978055a52714a02fc8664055"} Dec 03 12:47:15 crc kubenswrapper[4666]: I1203 12:47:15.571268 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerDied","Data":"fbca30b5b6a7787e34af08460490d8dce497a0ec2fa772fcf7e1c1502a0671b7"} Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.121035 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201037 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201113 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201186 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9l8\" (UniqueName: \"kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201255 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201292 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201324 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201355 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.201387 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs\") pod \"02394439-9892-4280-8b01-7978c9fbcc92\" (UID: \"02394439-9892-4280-8b01-7978c9fbcc92\") " Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.202454 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.203104 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.205288 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8" (OuterVolumeSpecName: "kube-api-access-hz9l8") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "kube-api-access-hz9l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.205434 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts" (OuterVolumeSpecName: "scripts") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.231410 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.245724 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.267074 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303367 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303399 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303412 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz9l8\" (UniqueName: \"kubernetes.io/projected/02394439-9892-4280-8b01-7978c9fbcc92-kube-api-access-hz9l8\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303425 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303433 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303441 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02394439-9892-4280-8b01-7978c9fbcc92-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.303451 4666 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.311261 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data" (OuterVolumeSpecName: "config-data") pod "02394439-9892-4280-8b01-7978c9fbcc92" (UID: "02394439-9892-4280-8b01-7978c9fbcc92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.405001 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02394439-9892-4280-8b01-7978c9fbcc92-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.598492 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" event={"ID":"e7912a6f-d432-4e4c-8c31-0023deac5557","Type":"ContainerStarted","Data":"1c8458fec2fe930bd0dd1aaba74ed6fde386701b6b50f247707095572e6f2be1"} Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.601918 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02394439-9892-4280-8b01-7978c9fbcc92","Type":"ContainerDied","Data":"82ac8166ed1609984d4a5958d37edc70d8f963c7848c23434cc55701e5492d02"} Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.601972 4666 scope.go:117] "RemoveContainer" containerID="54767a9d063d064e937cc597b6f105aba385650041bc1c60c0112c5b0ee1906c" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.602236 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.625109 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" podStartSLOduration=2.340154549 podStartE2EDuration="10.625070607s" podCreationTimestamp="2025-12-03 12:47:08 +0000 UTC" firstStartedPulling="2025-12-03 12:47:09.621367653 +0000 UTC m=+2018.466328704" lastFinishedPulling="2025-12-03 12:47:17.906283711 +0000 UTC m=+2026.751244762" observedRunningTime="2025-12-03 12:47:18.617102041 +0000 UTC m=+2027.462063112" watchObservedRunningTime="2025-12-03 12:47:18.625070607 +0000 UTC m=+2027.470031648" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.648038 4666 scope.go:117] "RemoveContainer" containerID="d5060b377f5b92b7e1907af2ba0f651876fa4b5abc23d97d7b37e1da54c8bb59" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.659901 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.676678 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.685609 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686001 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="sg-core" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686024 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="sg-core" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686045 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="extract-content" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686055 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="extract-content" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686072 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="extract-utilities" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686124 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="extract-utilities" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686151 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-notification-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686160 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-notification-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686181 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="registry-server" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686189 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="registry-server" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686200 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="proxy-httpd" Dec 03 12:47:18 crc kubenswrapper[4666]: 
I1203 12:47:18.686207 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="proxy-httpd" Dec 03 12:47:18 crc kubenswrapper[4666]: E1203 12:47:18.686224 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-central-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686232 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-central-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686420 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-central-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686437 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e18c2f-10f5-4c7b-90d6-b4232e1eee13" containerName="registry-server" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686448 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="proxy-httpd" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686469 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="sg-core" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.686482 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="02394439-9892-4280-8b01-7978c9fbcc92" containerName="ceilometer-notification-agent" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.688390 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.691281 4666 scope.go:117] "RemoveContainer" containerID="4a32b3eef5e5cf302cf1af34cf45ce2192a5dbf5978055a52714a02fc8664055" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.691551 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.691722 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.691913 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.704929 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.715296 4666 scope.go:117] "RemoveContainer" containerID="fbca30b5b6a7787e34af08460490d8dce497a0ec2fa772fcf7e1c1502a0671b7" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820170 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820228 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: 
I1203 12:47:18.820332 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbd72\" (UniqueName: \"kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820354 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820422 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820473 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820500 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.820524 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922411 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922505 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922548 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922579 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922641 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922700 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922781 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbd72\" (UniqueName: \"kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.922862 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.923467 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.928288 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.928289 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.932448 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.932971 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.939861 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " 
pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.941060 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:18 crc kubenswrapper[4666]: I1203 12:47:18.945736 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbd72\" (UniqueName: \"kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72\") pod \"ceilometer-0\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " pod="openstack/ceilometer-0" Dec 03 12:47:19 crc kubenswrapper[4666]: I1203 12:47:19.014134 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:47:19 crc kubenswrapper[4666]: I1203 12:47:19.458482 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02394439-9892-4280-8b01-7978c9fbcc92" path="/var/lib/kubelet/pods/02394439-9892-4280-8b01-7978c9fbcc92/volumes" Dec 03 12:47:19 crc kubenswrapper[4666]: I1203 12:47:19.459354 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:47:19 crc kubenswrapper[4666]: I1203 12:47:19.612891 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerStarted","Data":"a38fbbfe870d0e30cb789245ef3a2d5355f91031978f74e822469a6f3793b7e1"} Dec 03 12:47:22 crc kubenswrapper[4666]: I1203 12:47:22.641025 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerStarted","Data":"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74"} Dec 03 12:47:22 crc kubenswrapper[4666]: I1203 12:47:22.641545 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerStarted","Data":"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36"} Dec 03 12:47:23 crc kubenswrapper[4666]: I1203 12:47:23.651301 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerStarted","Data":"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc"} Dec 03 12:47:26 crc kubenswrapper[4666]: I1203 12:47:26.678853 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerStarted","Data":"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c"} Dec 03 12:47:26 crc kubenswrapper[4666]: I1203 12:47:26.679439 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:47:26 crc kubenswrapper[4666]: I1203 12:47:26.700348 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.710574952 podStartE2EDuration="8.700327231s" podCreationTimestamp="2025-12-03 12:47:18 +0000 UTC" firstStartedPulling="2025-12-03 12:47:19.460839983 +0000 UTC m=+2028.305801024" lastFinishedPulling="2025-12-03 12:47:25.450592202 +0000 UTC m=+2034.295553303" observedRunningTime="2025-12-03 12:47:26.697794923 +0000 UTC m=+2035.542755994" watchObservedRunningTime="2025-12-03 12:47:26.700327231 +0000 
Dec 03 12:47:33 crc kubenswrapper[4666]: I1203 12:47:33.772064 4666 generic.go:334] "Generic (PLEG): container finished" podID="e7912a6f-d432-4e4c-8c31-0023deac5557" containerID="1c8458fec2fe930bd0dd1aaba74ed6fde386701b6b50f247707095572e6f2be1" exitCode=0
Dec 03 12:47:33 crc kubenswrapper[4666]: I1203 12:47:33.772156 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" event={"ID":"e7912a6f-d432-4e4c-8c31-0023deac5557","Type":"ContainerDied","Data":"1c8458fec2fe930bd0dd1aaba74ed6fde386701b6b50f247707095572e6f2be1"}
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.130546 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qr2ph"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.248265 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle\") pod \"e7912a6f-d432-4e4c-8c31-0023deac5557\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") "
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.248434 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data\") pod \"e7912a6f-d432-4e4c-8c31-0023deac5557\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") "
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.248490 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts\") pod \"e7912a6f-d432-4e4c-8c31-0023deac5557\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") "
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.248657 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9mf6\" (UniqueName: \"kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6\") pod \"e7912a6f-d432-4e4c-8c31-0023deac5557\" (UID: \"e7912a6f-d432-4e4c-8c31-0023deac5557\") "
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.255228 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts" (OuterVolumeSpecName: "scripts") pod "e7912a6f-d432-4e4c-8c31-0023deac5557" (UID: "e7912a6f-d432-4e4c-8c31-0023deac5557"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.255297 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6" (OuterVolumeSpecName: "kube-api-access-m9mf6") pod "e7912a6f-d432-4e4c-8c31-0023deac5557" (UID: "e7912a6f-d432-4e4c-8c31-0023deac5557"). InnerVolumeSpecName "kube-api-access-m9mf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.284261 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data" (OuterVolumeSpecName: "config-data") pod "e7912a6f-d432-4e4c-8c31-0023deac5557" (UID: "e7912a6f-d432-4e4c-8c31-0023deac5557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.287473 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7912a6f-d432-4e4c-8c31-0023deac5557" (UID: "e7912a6f-d432-4e4c-8c31-0023deac5557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.351017 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.351210 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.351307 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7912a6f-d432-4e4c-8c31-0023deac5557-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.351663 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9mf6\" (UniqueName: \"kubernetes.io/projected/e7912a6f-d432-4e4c-8c31-0023deac5557-kube-api-access-m9mf6\") on node \"crc\" DevicePath \"\""
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.808872 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qr2ph" event={"ID":"e7912a6f-d432-4e4c-8c31-0023deac5557","Type":"ContainerDied","Data":"a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20"}
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.808927 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54b8da1ab4bc5f6f70ebaeacccd85fb4ebe3054adbaa54b51337932022b1b20"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.808961 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qr2ph"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.890643 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 12:47:35 crc kubenswrapper[4666]: E1203 12:47:35.891709 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7912a6f-d432-4e4c-8c31-0023deac5557" containerName="nova-cell0-conductor-db-sync"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.891757 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7912a6f-d432-4e4c-8c31-0023deac5557" containerName="nova-cell0-conductor-db-sync"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.892185 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7912a6f-d432-4e4c-8c31-0023deac5557" containerName="nova-cell0-conductor-db-sync"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.893633 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
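The qr2ph teardown above walks each volume through the same three phases: "UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and "Volume detached" (reconciler_common.go:293). A hypothetical stdlib-Go helper for tallying which phase each volume reached from journal text on stdin (the regexes target the exact wording above; this is a log-analysis sketch, not kubelet code):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// One pattern per teardown phase, matching the kubelet log wording above.
var phases = map[string]*regexp.Regexp{
	"started":   regexp.MustCompile(`UnmountVolume started for volume \\?"([^"\\]+)`),
	"torn down": regexp.MustCompile(`InnerVolumeSpecName "([^"]+)"`),
	"detached":  regexp.MustCompile(`Volume detached for volume \\?"([^"\\]+)`),
}

func main() {
	seen := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		for phase, re := range phases {
			if m := re.FindStringSubmatch(sc.Text()); m != nil {
				seen[m[1]] = phase // later lines overwrite earlier phases
			}
		}
	}
	for vol, phase := range seen {
		fmt.Println(vol, "=>", phase)
	}
}
```

Piping the entries above through it should report all four volumes (combined-ca-bundle, config-data, scripts, kube-api-access-m9mf6) as detached.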
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.896496 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-56296"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.896592 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.901010 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.964506 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.964612 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:35 crc kubenswrapper[4666]: I1203 12:47:35.964749 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55hs\" (UniqueName: \"kubernetes.io/projected/aba61f3d-9288-44a6-b194-c136cc1bda0a-kube-api-access-v55hs\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.067166 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.067577 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.067604 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55hs\" (UniqueName: \"kubernetes.io/projected/aba61f3d-9288-44a6-b194-c136cc1bda0a-kube-api-access-v55hs\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.073024 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.083605 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55hs\" (UniqueName: \"kubernetes.io/projected/aba61f3d-9288-44a6-b194-c136cc1bda0a-kube-api-access-v55hs\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.083803 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba61f3d-9288-44a6-b194-c136cc1bda0a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aba61f3d-9288-44a6-b194-c136cc1bda0a\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.213559 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.731860 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 12:47:36 crc kubenswrapper[4666]: W1203 12:47:36.732362 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaba61f3d_9288_44a6_b194_c136cc1bda0a.slice/crio-a1d11a8e70435396b8217e4dd943815a8fdfc7fbec85dd24c9ccf999160bf0d0 WatchSource:0}: Error finding container a1d11a8e70435396b8217e4dd943815a8fdfc7fbec85dd24c9ccf999160bf0d0: Status 404 returned error can't find the container with id a1d11a8e70435396b8217e4dd943815a8fdfc7fbec85dd24c9ccf999160bf0d0
Dec 03 12:47:36 crc kubenswrapper[4666]: I1203 12:47:36.820391 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aba61f3d-9288-44a6-b194-c136cc1bda0a","Type":"ContainerStarted","Data":"a1d11a8e70435396b8217e4dd943815a8fdfc7fbec85dd24c9ccf999160bf0d0"}
Dec 03 12:47:37 crc kubenswrapper[4666]: I1203 12:47:37.832542 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aba61f3d-9288-44a6-b194-c136cc1bda0a","Type":"ContainerStarted","Data":"4bff8191f0b94724877f4afb8b453d23abf7d1fe98dcb8b0e5fb23c58c8c966c"}
Dec 03 12:47:37 crc kubenswrapper[4666]: I1203 12:47:37.833076 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:37 crc kubenswrapper[4666]: I1203 12:47:37.863320 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8633020780000003 podStartE2EDuration="2.863302078s" podCreationTimestamp="2025-12-03 12:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:37.857180913 +0000 UTC m=+2046.702142004" watchObservedRunningTime="2025-12-03 12:47:37.863302078 +0000 UTC m=+2046.708263159"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.247211 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.709155 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5m5fc"]
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.710267 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.716783 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.717021 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.722597 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5m5fc"]
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.774514 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.774612 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.774646 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.774685 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmq8\" (UniqueName: \"kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.876080 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.876172 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.876198 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.876230 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmq8\" (UniqueName: \"kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.885709 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.885753 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.889927 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.892336 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.897227 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.900880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.900906 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmq8\" (UniqueName: \"kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8\") pod \"nova-cell0-cell-mapping-5m5fc\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.910846 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.954668 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.956057 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.957845 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.977788 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbrv\" (UniqueName: \"kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.977855 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.977941 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:41 crc kubenswrapper[4666]: I1203 12:47:41.999645 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.028742 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5m5fc"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.080260 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.082789 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.082870 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbrv\" (UniqueName: \"kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.082945 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.082977 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.083004 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.083038 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skwdf\" (UniqueName: \"kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.083080 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.084870 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.090467 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.096129 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.100974 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.102521 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.116211 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbrv\" (UniqueName: \"kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv\") pod \"nova-scheduler-0\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.131220 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.133200 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.133327 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.140531 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.184327 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.186937 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.191679 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.191799 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.191849 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.191916 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.191962 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192012 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p42b\" (UniqueName: \"kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192034 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192072 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192118 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skwdf\" (UniqueName: \"kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192306 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192331 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.192753 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.202803 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.208510 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.226132 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.226442 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skwdf\" (UniqueName: \"kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf\") pod \"nova-api-0\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.281068 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.294889 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.294956 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.295012 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p42b\" (UniqueName: \"kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.295070 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300645 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300696 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300736 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300801 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300821 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.300838 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4b29\" (UniqueName: \"kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.301073 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.301145 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.301345 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.302113 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.302903 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.305809 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.308654 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.308702 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.313462 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p42b\" (UniqueName: \"kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.316526 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0"
"MountVolume.SetUp succeeded for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") pod \"nova-metadata-0\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " pod="openstack/nova-metadata-0" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.404874 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.405167 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.405188 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4b29\" (UniqueName: \"kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.405301 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.405383 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.405790 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.406561 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.407494 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.413650 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.433386 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4b29\" (UniqueName: \"kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29\") pod \"dnsmasq-dns-566b5b7845-c9jvt\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") " pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.521315 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.564089 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.572014 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.614662 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5m5fc"] Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.651913 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.740781 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sts7r"] Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.741937 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.747693 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.747850 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.759643 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sts7r"] Dec 03 12:47:42 crc kubenswrapper[4666]: W1203 12:47:42.770816 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba6731ba_a089_4020_b248_ea92fe29aa86.slice/crio-86ba7109a13e4f2d49430d6aae4f03c65db2806f06173f623d2c4add07d49f12 WatchSource:0}: Error finding container 86ba7109a13e4f2d49430d6aae4f03c65db2806f06173f623d2c4add07d49f12: Status 404 returned error can't find the container with id 86ba7109a13e4f2d49430d6aae4f03c65db2806f06173f623d2c4add07d49f12 Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.786672 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.812287 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.812336 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv275\" (UniqueName: \"kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.812633 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.812724 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.892166 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerStarted","Data":"623449d7b2140dd653d3721b1372a30926e9fc3bf4c7dfbb545cdc4deaf04e4c"} Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.894799 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5m5fc" event={"ID":"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7","Type":"ContainerStarted","Data":"9533329919b1f410f230b910f24c86cdf34123fbc0e19964f7fb4811273a3ad7"} Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.896385 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba6731ba-a089-4020-b248-ea92fe29aa86","Type":"ContainerStarted","Data":"86ba7109a13e4f2d49430d6aae4f03c65db2806f06173f623d2c4add07d49f12"} Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.914435 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.914509 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.914565 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.914596 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv275\" (UniqueName: \"kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275\") pod 
\"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.919941 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.920493 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.921811 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:42 crc kubenswrapper[4666]: I1203 12:47:42.938883 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv275\" (UniqueName: \"kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275\") pod \"nova-cell1-conductor-db-sync-sts7r\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.041085 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:47:43 crc kubenswrapper[4666]: W1203 12:47:43.050440 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7589d785_1f6d_46cb_93e3_61e2bd28de2a.slice/crio-eb963bdd30a0e7c8a50bc5d94f6c636a3ae3e772e2eda0f38f311119d0353d07 WatchSource:0}: Error finding container eb963bdd30a0e7c8a50bc5d94f6c636a3ae3e772e2eda0f38f311119d0353d07: Status 404 returned error can't find the container with id eb963bdd30a0e7c8a50bc5d94f6c636a3ae3e772e2eda0f38f311119d0353d07 Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.071336 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.174203 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.175876 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.572230 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sts7r"]
Dec 03 12:47:43 crc kubenswrapper[4666]: W1203 12:47:43.579602 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b702d3b_190d_48dc_8ee0_531b9d6f712b.slice/crio-f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3 WatchSource:0}: Error finding container f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3: Status 404 returned error can't find the container with id f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.914835 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5m5fc" event={"ID":"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7","Type":"ContainerStarted","Data":"33fdfc004fbd6f16adf499e9c395bf05d28f9b38aa0ca16dfbee30e93a2ecc11"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.919038 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sts7r" event={"ID":"1b702d3b-190d-48dc-8ee0-531b9d6f712b","Type":"ContainerStarted","Data":"2ac794b5f11600be905c18b71906ddabf9c548c4f251843c9d8d1e93b2b70987"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.919112 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sts7r" event={"ID":"1b702d3b-190d-48dc-8ee0-531b9d6f712b","Type":"ContainerStarted","Data":"f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.921879 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dddd9cb-a490-4862-b1c5-3f581891ba57","Type":"ContainerStarted","Data":"62e5793691cd0b6bf40f1dbe6f8790632c943416ef09d17405498155faf5f06e"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.925833 4666 generic.go:334] "Generic (PLEG): container finished" podID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerID="446093b16b60891f0f489fc174a7115f00bdb51520a56fc9f7d6622581f67813" exitCode=0
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.925925 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" event={"ID":"ab129ab9-27f4-43c5-aca6-397236fb03c1","Type":"ContainerDied","Data":"446093b16b60891f0f489fc174a7115f00bdb51520a56fc9f7d6622581f67813"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.925962 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" event={"ID":"ab129ab9-27f4-43c5-aca6-397236fb03c1","Type":"ContainerStarted","Data":"0a793764d03de8e2647aea7963c4111b72379405645c43984f28a46469d700de"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.929457 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerStarted","Data":"eb963bdd30a0e7c8a50bc5d94f6c636a3ae3e772e2eda0f38f311119d0353d07"}
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.973455 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-sts7r" podStartSLOduration=1.9734327980000002 podStartE2EDuration="1.973432798s" podCreationTimestamp="2025-12-03 12:47:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:43.964502647 +0000 UTC m=+2052.809463728" watchObservedRunningTime="2025-12-03 12:47:43.973432798 +0000 UTC m=+2052.818393849"
Dec 03 12:47:43 crc kubenswrapper[4666]: I1203 12:47:43.977009 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5m5fc" podStartSLOduration=2.976986294 podStartE2EDuration="2.976986294s" podCreationTimestamp="2025-12-03 12:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:43.945547325 +0000 UTC m=+2052.790508376" watchObservedRunningTime="2025-12-03 12:47:43.976986294 +0000 UTC m=+2052.821947355"
Dec 03 12:47:45 crc kubenswrapper[4666]: I1203 12:47:45.623904 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 12:47:45 crc kubenswrapper[4666]: I1203 12:47:45.632965 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.956511 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba6731ba-a089-4020-b248-ea92fe29aa86","Type":"ContainerStarted","Data":"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483"}
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.969034 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dddd9cb-a490-4862-b1c5-3f581891ba57","Type":"ContainerStarted","Data":"dcf2ec72c6f2206913c9e25a1eaf0ee89abfddf1153b8faaa1e445ba73cb9ca9"}
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.969310 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4dddd9cb-a490-4862-b1c5-3f581891ba57" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dcf2ec72c6f2206913c9e25a1eaf0ee89abfddf1153b8faaa1e445ba73cb9ca9" gracePeriod=30
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.982999 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.775731198 podStartE2EDuration="5.982971544s" podCreationTimestamp="2025-12-03 12:47:41 +0000 UTC" firstStartedPulling="2025-12-03 12:47:42.782795507 +0000 UTC m=+2051.627756558" lastFinishedPulling="2025-12-03 12:47:45.990035853 +0000 UTC m=+2054.834996904" observedRunningTime="2025-12-03 12:47:46.977083195 +0000 UTC m=+2055.822044276" watchObservedRunningTime="2025-12-03 12:47:46.982971544 +0000 UTC m=+2055.827932635"
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.988913 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerStarted","Data":"aa1513804e384e3ed08fa1815f323e5f07ea71b4ae1db8b651019fe08f17718a"}
Dec 03 12:47:46 crc kubenswrapper[4666]: I1203 12:47:46.988959 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerStarted","Data":"8d12ec2871170eafef694b63ac74ecf14896e1bd7a35e7e5fbd044ada6fb0320"}
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.009217 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" event={"ID":"ab129ab9-27f4-43c5-aca6-397236fb03c1","Type":"ContainerStarted","Data":"1724b4cce6e821b8de73dc7074f9b90ddd03803628e7914a2bd8d94ac6fe276a"}
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.009470 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.010510 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.200388308 podStartE2EDuration="5.010490487s" podCreationTimestamp="2025-12-03 12:47:42 +0000 UTC" firstStartedPulling="2025-12-03 12:47:43.182575295 +0000 UTC m=+2052.027536336" lastFinishedPulling="2025-12-03 12:47:45.992677464 +0000 UTC m=+2054.837638515" observedRunningTime="2025-12-03 12:47:47.008879104 +0000 UTC m=+2055.853840155" watchObservedRunningTime="2025-12-03 12:47:47.010490487 +0000 UTC m=+2055.855451548"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.016594 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerStarted","Data":"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8"}
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.016642 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerStarted","Data":"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d"}
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.016757 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-log" containerID="cri-o://edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d" gracePeriod=30
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.017016 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-metadata" containerID="cri-o://b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8" gracePeriod=30
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.048770 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7328715900000002 podStartE2EDuration="6.048753081s" podCreationTimestamp="2025-12-03 12:47:41 +0000 UTC" firstStartedPulling="2025-12-03 12:47:42.671261014 +0000 UTC m=+2051.516222065" lastFinishedPulling="2025-12-03 12:47:45.987142505 +0000 UTC m=+2054.832103556" observedRunningTime="2025-12-03 12:47:47.035434811 +0000 UTC m=+2055.880395872" watchObservedRunningTime="2025-12-03 12:47:47.048753081 +0000 UTC m=+2055.893714132"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.059971 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" podStartSLOduration=5.059956604 podStartE2EDuration="5.059956604s" podCreationTimestamp="2025-12-03 12:47:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:47.058492704 +0000 UTC m=+2055.903453775" watchObservedRunningTime="2025-12-03 12:47:47.059956604 +0000 UTC m=+2055.904917655"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.094292 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.156509914 podStartE2EDuration="5.0942673s" podCreationTimestamp="2025-12-03 12:47:42 +0000 UTC" firstStartedPulling="2025-12-03 12:47:43.052479352 +0000 UTC m=+2051.897440423" lastFinishedPulling="2025-12-03 12:47:45.990236758 +0000 UTC m=+2054.835197809" observedRunningTime="2025-12-03 12:47:47.077243561 +0000 UTC m=+2055.922204612" watchObservedRunningTime="2025-12-03 12:47:47.0942673 +0000 UTC m=+2055.939228351"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.282115 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.522278 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.522367 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 12:47:47 crc kubenswrapper[4666]: I1203 12:47:47.564566 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 12:47:48 crc kubenswrapper[4666]: I1203 12:47:48.030395 4666 generic.go:334] "Generic (PLEG): container finished" podID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerID="edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d" exitCode=143
Dec 03 12:47:48 crc kubenswrapper[4666]: I1203 12:47:48.031863 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerDied","Data":"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d"}
Dec 03 12:47:49 crc kubenswrapper[4666]: I1203 12:47:49.023464 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 03 12:47:51 crc kubenswrapper[4666]: I1203 12:47:51.058534 4666 generic.go:334] "Generic (PLEG): container finished" podID="8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" containerID="33fdfc004fbd6f16adf499e9c395bf05d28f9b38aa0ca16dfbee30e93a2ecc11" exitCode=0
Dec 03 12:47:51 crc kubenswrapper[4666]: I1203 12:47:51.058817 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5m5fc" event={"ID":"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7","Type":"ContainerDied","Data":"33fdfc004fbd6f16adf499e9c395bf05d28f9b38aa0ca16dfbee30e93a2ecc11"}
Dec 03 12:47:51 crc kubenswrapper[4666]: I1203 12:47:51.062303 4666 generic.go:334] "Generic (PLEG): container finished" podID="1b702d3b-190d-48dc-8ee0-531b9d6f712b" containerID="2ac794b5f11600be905c18b71906ddabf9c548c4f251843c9d8d1e93b2b70987" exitCode=0
Dec 03 12:47:51 crc kubenswrapper[4666]: I1203 12:47:51.062367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sts7r" event={"ID":"1b702d3b-190d-48dc-8ee0-531b9d6f712b","Type":"ContainerDied","Data":"2ac794b5f11600be905c18b71906ddabf9c548c4f251843c9d8d1e93b2b70987"}
Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.282485 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
12:47:52.282485 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.303660 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.303717 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.326212 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.537044 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5m5fc" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.542245 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.574285 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629162 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data\") pod \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629246 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts\") pod \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629305 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle\") pod \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629377 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data\") pod \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629420 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle\") pod \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts\") pod \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629538 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv275\" (UniqueName: \"kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275\") pod 
\"1b702d3b-190d-48dc-8ee0-531b9d6f712b\" (UID: \"1b702d3b-190d-48dc-8ee0-531b9d6f712b\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.629588 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljmq8\" (UniqueName: \"kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8\") pod \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\" (UID: \"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7\") " Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.637378 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275" (OuterVolumeSpecName: "kube-api-access-sv275") pod "1b702d3b-190d-48dc-8ee0-531b9d6f712b" (UID: "1b702d3b-190d-48dc-8ee0-531b9d6f712b"). InnerVolumeSpecName "kube-api-access-sv275". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.652365 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8" (OuterVolumeSpecName: "kube-api-access-ljmq8") pod "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" (UID: "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7"). InnerVolumeSpecName "kube-api-access-ljmq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.662040 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"] Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.662307 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="dnsmasq-dns" containerID="cri-o://a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae" gracePeriod=10 Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.676329 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts" (OuterVolumeSpecName: "scripts") pod "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" (UID: "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.682809 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" (UID: "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.685175 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts" (OuterVolumeSpecName: "scripts") pod "1b702d3b-190d-48dc-8ee0-531b9d6f712b" (UID: "1b702d3b-190d-48dc-8ee0-531b9d6f712b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.706496 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data" (OuterVolumeSpecName: "config-data") pod "1b702d3b-190d-48dc-8ee0-531b9d6f712b" (UID: "1b702d3b-190d-48dc-8ee0-531b9d6f712b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.718837 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data" (OuterVolumeSpecName: "config-data") pod "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" (UID: "8d0002d8-cf2c-4ad2-a464-f1cff29a03d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.727871 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b702d3b-190d-48dc-8ee0-531b9d6f712b" (UID: "1b702d3b-190d-48dc-8ee0-531b9d6f712b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733867 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733910 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv275\" (UniqueName: \"kubernetes.io/projected/1b702d3b-190d-48dc-8ee0-531b9d6f712b-kube-api-access-sv275\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733927 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljmq8\" (UniqueName: \"kubernetes.io/projected/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-kube-api-access-ljmq8\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733938 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733950 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733960 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733972 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:52 crc kubenswrapper[4666]: I1203 12:47:52.733982 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b702d3b-190d-48dc-8ee0-531b9d6f712b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.067294 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.086060 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5m5fc" event={"ID":"8d0002d8-cf2c-4ad2-a464-f1cff29a03d7","Type":"ContainerDied","Data":"9533329919b1f410f230b910f24c86cdf34123fbc0e19964f7fb4811273a3ad7"} Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.086128 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9533329919b1f410f230b910f24c86cdf34123fbc0e19964f7fb4811273a3ad7" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.086097 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5m5fc" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.093045 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sts7r" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.093443 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sts7r" event={"ID":"1b702d3b-190d-48dc-8ee0-531b9d6f712b","Type":"ContainerDied","Data":"f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3"} Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.093499 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6999a3dfb70d47887156f708f65c61db237443f24abb2738417659fca5f46e3" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.101699 4666 generic.go:334] "Generic (PLEG): container finished" podID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerID="a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae" exitCode=0 Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.103642 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.105915 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" event={"ID":"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b","Type":"ContainerDied","Data":"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae"} Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.107612 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-76cj6" event={"ID":"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b","Type":"ContainerDied","Data":"4ab63e07113c48a9a09e2b5a263ae0c8bf04e861b10eaa8df1a0c0b137a3afef"} Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.107830 4666 scope.go:117] "RemoveContainer" containerID="a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.143875 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb\") pod \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.143981 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb\") pod \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.144020 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dk8f\" (UniqueName: \"kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f\") pod \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.144054 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config\") pod \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.144175 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc\") pod \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\" (UID: \"cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b\") " Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.154335 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f" (OuterVolumeSpecName: "kube-api-access-4dk8f") pod "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" (UID: "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b"). InnerVolumeSpecName "kube-api-access-4dk8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.190678 4666 scope.go:117] "RemoveContainer" containerID="36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208174 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.208710 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" containerName="nova-manage" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208729 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" containerName="nova-manage" Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.208748 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b702d3b-190d-48dc-8ee0-531b9d6f712b" containerName="nova-cell1-conductor-db-sync" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208755 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b702d3b-190d-48dc-8ee0-531b9d6f712b" containerName="nova-cell1-conductor-db-sync" Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.208765 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="init" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208772 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="init" Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.208783 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="dnsmasq-dns" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208789 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="dnsmasq-dns" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.208988 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" containerName="dnsmasq-dns" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.209000 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b702d3b-190d-48dc-8ee0-531b9d6f712b" containerName="nova-cell1-conductor-db-sync" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.209019 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" containerName="nova-manage" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.209807 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.209896 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.216939 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.217047 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.229606 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" (UID: "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.231969 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config" (OuterVolumeSpecName: "config") pod "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" (UID: "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.233833 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" (UID: "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.238859 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" (UID: "cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.246381 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.246405 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.246415 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.246425 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dk8f\" (UniqueName: \"kubernetes.io/projected/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-kube-api-access-4dk8f\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.246432 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.263673 4666 scope.go:117] "RemoveContainer" containerID="a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae" Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.264471 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae\": container with ID starting with a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae not found: ID does not exist" containerID="a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.264587 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae"} err="failed to get container status \"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae\": rpc error: code = NotFound desc = could not find container \"a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae\": container with ID starting with a5b5527bf4f9fc8319d07bc3d01c2118a1bff03e666956cc9ecdb0573a8980ae not found: ID does not exist" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.264705 4666 scope.go:117] "RemoveContainer" containerID="36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc" Dec 03 12:47:53 crc kubenswrapper[4666]: E1203 12:47:53.265211 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc\": container with ID starting with 36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc not found: ID does not exist" containerID="36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.265239 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc"} err="failed to get container status 
\"36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc\": rpc error: code = NotFound desc = could not find container \"36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc\": container with ID starting with 36922782e5fdd5514324e39808e9e1c76c3e776ed887263e0efa5a4b69a611cc not found: ID does not exist" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.319454 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.319673 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-log" containerID="cri-o://8d12ec2871170eafef694b63ac74ecf14896e1bd7a35e7e5fbd044ada6fb0320" gracePeriod=30 Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.320286 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-api" containerID="cri-o://aa1513804e384e3ed08fa1815f323e5f07ea71b4ae1db8b651019fe08f17718a" gracePeriod=30 Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.330603 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.330693 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.348234 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblr2\" (UniqueName: \"kubernetes.io/projected/31efcd1f-210a-408a-b711-6862b6537a7d-kube-api-access-pblr2\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.348282 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.348381 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.450179 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.450273 4666 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pblr2\" (UniqueName: \"kubernetes.io/projected/31efcd1f-210a-408a-b711-6862b6537a7d-kube-api-access-pblr2\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.450294 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.457866 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.457956 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31efcd1f-210a-408a-b711-6862b6537a7d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.472802 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblr2\" (UniqueName: \"kubernetes.io/projected/31efcd1f-210a-408a-b711-6862b6537a7d-kube-api-access-pblr2\") pod \"nova-cell1-conductor-0\" (UID: \"31efcd1f-210a-408a-b711-6862b6537a7d\") " pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.539948 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.553557 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"] Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.561514 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-76cj6"] Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.787698 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:53 crc kubenswrapper[4666]: I1203 12:47:53.994046 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 12:47:54 crc kubenswrapper[4666]: I1203 12:47:54.119241 4666 generic.go:334] "Generic (PLEG): container finished" podID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerID="8d12ec2871170eafef694b63ac74ecf14896e1bd7a35e7e5fbd044ada6fb0320" exitCode=143 Dec 03 12:47:54 crc kubenswrapper[4666]: I1203 12:47:54.119514 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerDied","Data":"8d12ec2871170eafef694b63ac74ecf14896e1bd7a35e7e5fbd044ada6fb0320"} Dec 03 12:47:54 crc kubenswrapper[4666]: I1203 12:47:54.121592 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31efcd1f-210a-408a-b711-6862b6537a7d","Type":"ContainerStarted","Data":"24a1a06779309dab42ec903d870ad8415a7e305adb1dc2ad39bd8af3a8244313"} Dec 03 12:47:55 crc kubenswrapper[4666]: I1203 12:47:55.129306 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31efcd1f-210a-408a-b711-6862b6537a7d","Type":"ContainerStarted","Data":"5680f43a2de2fd2d9d9b2c3de6672a0ebc62d98284b0ac479af698fb0d8d476f"} Dec 03 12:47:55 crc kubenswrapper[4666]: I1203 12:47:55.129402 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerName="nova-scheduler-scheduler" containerID="cri-o://8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" gracePeriod=30 Dec 03 12:47:55 crc kubenswrapper[4666]: I1203 12:47:55.153298 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.153280156 podStartE2EDuration="2.153280156s" podCreationTimestamp="2025-12-03 12:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:47:55.143345327 +0000 UTC m=+2063.988306398" watchObservedRunningTime="2025-12-03 12:47:55.153280156 +0000 UTC m=+2063.998241207" Dec 03 12:47:55 crc kubenswrapper[4666]: I1203 12:47:55.435029 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b" path="/var/lib/kubelet/pods/cdbf81e4-0170-4cd0-bc4b-6a1194bc6c6b/volumes" Dec 03 12:47:56 crc kubenswrapper[4666]: I1203 12:47:56.143554 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 12:47:57 crc kubenswrapper[4666]: E1203 12:47:57.283993 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 12:47:57 crc kubenswrapper[4666]: E1203 12:47:57.285975 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 12:47:57 crc kubenswrapper[4666]: E1203 12:47:57.287918 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 12:47:57 crc kubenswrapper[4666]: E1203 12:47:57.287990 4666 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerName="nova-scheduler-scheduler" Dec 03 12:47:57 crc kubenswrapper[4666]: I1203 12:47:57.991918 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.035778 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data\") pod \"ba6731ba-a089-4020-b248-ea92fe29aa86\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.036294 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle\") pod \"ba6731ba-a089-4020-b248-ea92fe29aa86\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.036383 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbrv\" (UniqueName: \"kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv\") pod \"ba6731ba-a089-4020-b248-ea92fe29aa86\" (UID: \"ba6731ba-a089-4020-b248-ea92fe29aa86\") " Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.041920 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv" (OuterVolumeSpecName: "kube-api-access-wqbrv") pod "ba6731ba-a089-4020-b248-ea92fe29aa86" (UID: "ba6731ba-a089-4020-b248-ea92fe29aa86"). InnerVolumeSpecName "kube-api-access-wqbrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.061434 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba6731ba-a089-4020-b248-ea92fe29aa86" (UID: "ba6731ba-a089-4020-b248-ea92fe29aa86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.067280 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data" (OuterVolumeSpecName: "config-data") pod "ba6731ba-a089-4020-b248-ea92fe29aa86" (UID: "ba6731ba-a089-4020-b248-ea92fe29aa86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.138425 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.138462 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqbrv\" (UniqueName: \"kubernetes.io/projected/ba6731ba-a089-4020-b248-ea92fe29aa86-kube-api-access-wqbrv\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.138475 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6731ba-a089-4020-b248-ea92fe29aa86-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.167622 4666 generic.go:334] "Generic (PLEG): container finished" podID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" exitCode=0 Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.167674 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba6731ba-a089-4020-b248-ea92fe29aa86","Type":"ContainerDied","Data":"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483"} Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.167691 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.167717 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba6731ba-a089-4020-b248-ea92fe29aa86","Type":"ContainerDied","Data":"86ba7109a13e4f2d49430d6aae4f03c65db2806f06173f623d2c4add07d49f12"} Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.167736 4666 scope.go:117] "RemoveContainer" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.197189 4666 scope.go:117] "RemoveContainer" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" Dec 03 12:47:58 crc kubenswrapper[4666]: E1203 12:47:58.198528 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483\": container with ID starting with 8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483 not found: ID does not exist" containerID="8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.198569 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483"} err="failed to get container status \"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483\": rpc error: code = NotFound desc = could not find container \"8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483\": container with ID starting with 8908e5f5144f7e876f70795a5cd2ab029f12d670ce7acffec78a4c0663697483 not found: ID does not exist" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.239159 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.255150 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.261160 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:58 crc kubenswrapper[4666]: E1203 12:47:58.261514 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerName="nova-scheduler-scheduler" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.261530 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerName="nova-scheduler-scheduler" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.261731 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" containerName="nova-scheduler-scheduler" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.262295 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.267533 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.283231 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.341570 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.341694 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.341735 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tccf2\" (UniqueName: \"kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.443072 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tccf2\" (UniqueName: \"kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.443176 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.443426 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.447133 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.447142 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.458922 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tccf2\" (UniqueName: 
\"kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2\") pod \"nova-scheduler-0\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") " pod="openstack/nova-scheduler-0" Dec 03 12:47:58 crc kubenswrapper[4666]: I1203 12:47:58.584205 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.023560 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.177676 4666 generic.go:334] "Generic (PLEG): container finished" podID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerID="aa1513804e384e3ed08fa1815f323e5f07ea71b4ae1db8b651019fe08f17718a" exitCode=0 Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.177754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerDied","Data":"aa1513804e384e3ed08fa1815f323e5f07ea71b4ae1db8b651019fe08f17718a"} Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.178034 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"11dc853c-6a55-4aeb-aba5-0c55faebf8ff","Type":"ContainerDied","Data":"623449d7b2140dd653d3721b1372a30926e9fc3bf4c7dfbb545cdc4deaf04e4c"} Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.178173 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623449d7b2140dd653d3721b1372a30926e9fc3bf4c7dfbb545cdc4deaf04e4c" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.179186 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67e03c9f-9729-44b8-954e-128a4829d47c","Type":"ContainerStarted","Data":"0f4b7b81761250524ce6d92e438e1e23dca11e8b396346204e72a1dd0d10d495"} Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.181032 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.257287 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skwdf\" (UniqueName: \"kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf\") pod \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.257427 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle\") pod \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.257465 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs\") pod \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.257538 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data\") pod \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\" (UID: \"11dc853c-6a55-4aeb-aba5-0c55faebf8ff\") " Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.258184 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs" (OuterVolumeSpecName: "logs") pod "11dc853c-6a55-4aeb-aba5-0c55faebf8ff" (UID: "11dc853c-6a55-4aeb-aba5-0c55faebf8ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.260901 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf" (OuterVolumeSpecName: "kube-api-access-skwdf") pod "11dc853c-6a55-4aeb-aba5-0c55faebf8ff" (UID: "11dc853c-6a55-4aeb-aba5-0c55faebf8ff"). InnerVolumeSpecName "kube-api-access-skwdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.287843 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11dc853c-6a55-4aeb-aba5-0c55faebf8ff" (UID: "11dc853c-6a55-4aeb-aba5-0c55faebf8ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.298142 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data" (OuterVolumeSpecName: "config-data") pod "11dc853c-6a55-4aeb-aba5-0c55faebf8ff" (UID: "11dc853c-6a55-4aeb-aba5-0c55faebf8ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.359820 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skwdf\" (UniqueName: \"kubernetes.io/projected/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-kube-api-access-skwdf\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.359868 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.359883 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.359894 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dc853c-6a55-4aeb-aba5-0c55faebf8ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:47:59 crc kubenswrapper[4666]: I1203 12:47:59.434295 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6731ba-a089-4020-b248-ea92fe29aa86" path="/var/lib/kubelet/pods/ba6731ba-a089-4020-b248-ea92fe29aa86/volumes" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.192314 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.192332 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67e03c9f-9729-44b8-954e-128a4829d47c","Type":"ContainerStarted","Data":"eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0"} Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.245113 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.245073809 podStartE2EDuration="2.245073809s" podCreationTimestamp="2025-12-03 12:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:00.219625932 +0000 UTC m=+2069.064586983" watchObservedRunningTime="2025-12-03 12:48:00.245073809 +0000 UTC m=+2069.090034880" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.268960 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.282857 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.294853 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:00 crc kubenswrapper[4666]: E1203 12:48:00.295586 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-log" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.295613 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-log" Dec 03 12:48:00 crc kubenswrapper[4666]: E1203 12:48:00.295645 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-api" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.295656 4666 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-api" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.295938 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-log" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.295965 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" containerName="nova-api-api" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.297611 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.300761 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.308078 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.388676 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.388729 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.388792 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7k9\" (UniqueName: \"kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.388817 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.490512 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.490604 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.490703 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft7k9\" (UniqueName: \"kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.490750 
4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.494014 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.498657 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.511961 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft7k9\" (UniqueName: \"kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.513034 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " pod="openstack/nova-api-0" Dec 03 12:48:00 crc kubenswrapper[4666]: I1203 12:48:00.612598 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:01 crc kubenswrapper[4666]: I1203 12:48:01.126205 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:01 crc kubenswrapper[4666]: W1203 12:48:01.130858 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27e2b2a8_a5c9_4847_9116_f23895f6ef64.slice/crio-52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f WatchSource:0}: Error finding container 52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f: Status 404 returned error can't find the container with id 52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f Dec 03 12:48:01 crc kubenswrapper[4666]: I1203 12:48:01.202201 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerStarted","Data":"52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f"} Dec 03 12:48:01 crc kubenswrapper[4666]: I1203 12:48:01.458567 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11dc853c-6a55-4aeb-aba5-0c55faebf8ff" path="/var/lib/kubelet/pods/11dc853c-6a55-4aeb-aba5-0c55faebf8ff/volumes" Dec 03 12:48:02 crc kubenswrapper[4666]: I1203 12:48:02.216376 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerStarted","Data":"f8656f33884cc910982e58a86c1461e8b46a48f97ca902fed2f4a4ee012603f5"} Dec 03 12:48:02 crc kubenswrapper[4666]: I1203 12:48:02.216445 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerStarted","Data":"81f514f62bbee44890c51713efc64ca9e1440e88210594251a57709dc0fa07e9"} Dec 03 12:48:02 crc kubenswrapper[4666]: I1203 12:48:02.259615 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.259591406 podStartE2EDuration="2.259591406s" podCreationTimestamp="2025-12-03 12:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:02.244428477 +0000 UTC m=+2071.089389578" watchObservedRunningTime="2025-12-03 12:48:02.259591406 +0000 UTC m=+2071.104552467" Dec 03 12:48:03 crc kubenswrapper[4666]: I1203 12:48:03.576970 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 12:48:03 crc kubenswrapper[4666]: I1203 12:48:03.585354 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 12:48:08 crc kubenswrapper[4666]: I1203 12:48:08.584342 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 12:48:08 crc kubenswrapper[4666]: I1203 12:48:08.626439 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 12:48:09 crc kubenswrapper[4666]: I1203 12:48:09.319447 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 12:48:10 crc kubenswrapper[4666]: I1203 12:48:10.614156 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:48:10 crc kubenswrapper[4666]: I1203 12:48:10.614469 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:48:11 crc kubenswrapper[4666]: I1203 12:48:11.697310 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:11 crc kubenswrapper[4666]: I1203 12:48:11.697646 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.363287 4666 generic.go:334] "Generic (PLEG): container finished" podID="4dddd9cb-a490-4862-b1c5-3f581891ba57" containerID="dcf2ec72c6f2206913c9e25a1eaf0ee89abfddf1153b8faaa1e445ba73cb9ca9" exitCode=137 Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.363344 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dddd9cb-a490-4862-b1c5-3f581891ba57","Type":"ContainerDied","Data":"dcf2ec72c6f2206913c9e25a1eaf0ee89abfddf1153b8faaa1e445ba73cb9ca9"} Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.732131 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.875787 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle\") pod \"4dddd9cb-a490-4862-b1c5-3f581891ba57\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.876029 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p42b\" (UniqueName: \"kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b\") pod \"4dddd9cb-a490-4862-b1c5-3f581891ba57\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.876076 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data\") pod \"4dddd9cb-a490-4862-b1c5-3f581891ba57\" (UID: \"4dddd9cb-a490-4862-b1c5-3f581891ba57\") " Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.881740 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b" (OuterVolumeSpecName: "kube-api-access-7p42b") pod "4dddd9cb-a490-4862-b1c5-3f581891ba57" (UID: "4dddd9cb-a490-4862-b1c5-3f581891ba57"). InnerVolumeSpecName "kube-api-access-7p42b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.912030 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data" (OuterVolumeSpecName: "config-data") pod "4dddd9cb-a490-4862-b1c5-3f581891ba57" (UID: "4dddd9cb-a490-4862-b1c5-3f581891ba57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.920984 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dddd9cb-a490-4862-b1c5-3f581891ba57" (UID: "4dddd9cb-a490-4862-b1c5-3f581891ba57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.978304 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.978343 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p42b\" (UniqueName: \"kubernetes.io/projected/4dddd9cb-a490-4862-b1c5-3f581891ba57-kube-api-access-7p42b\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.978358 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dddd9cb-a490-4862-b1c5-3f581891ba57-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:17 crc kubenswrapper[4666]: I1203 12:48:17.981928 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079019 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data\") pod \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079117 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") pod \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079172 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs\") pod \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079225 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle\") pod \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\" (UID: \"7589d785-1f6d-46cb-93e3-61e2bd28de2a\") " Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079569 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs" (OuterVolumeSpecName: "logs") pod "7589d785-1f6d-46cb-93e3-61e2bd28de2a" (UID: "7589d785-1f6d-46cb-93e3-61e2bd28de2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.079827 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7589d785-1f6d-46cb-93e3-61e2bd28de2a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.089379 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml" (OuterVolumeSpecName: "kube-api-access-bn7ml") pod "7589d785-1f6d-46cb-93e3-61e2bd28de2a" (UID: "7589d785-1f6d-46cb-93e3-61e2bd28de2a"). InnerVolumeSpecName "kube-api-access-bn7ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.118204 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data" (OuterVolumeSpecName: "config-data") pod "7589d785-1f6d-46cb-93e3-61e2bd28de2a" (UID: "7589d785-1f6d-46cb-93e3-61e2bd28de2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.137239 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7589d785-1f6d-46cb-93e3-61e2bd28de2a" (UID: "7589d785-1f6d-46cb-93e3-61e2bd28de2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.181724 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.181758 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7589d785-1f6d-46cb-93e3-61e2bd28de2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.181767 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7ml\" (UniqueName: \"kubernetes.io/projected/7589d785-1f6d-46cb-93e3-61e2bd28de2a-kube-api-access-bn7ml\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.377692 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dddd9cb-a490-4862-b1c5-3f581891ba57","Type":"ContainerDied","Data":"62e5793691cd0b6bf40f1dbe6f8790632c943416ef09d17405498155faf5f06e"} Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.378435 4666 scope.go:117] "RemoveContainer" containerID="dcf2ec72c6f2206913c9e25a1eaf0ee89abfddf1153b8faaa1e445ba73cb9ca9" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.377823 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.382547 4666 generic.go:334] "Generic (PLEG): container finished" podID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerID="b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8" exitCode=137 Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.382606 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerDied","Data":"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8"} Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.382617 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.382647 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7589d785-1f6d-46cb-93e3-61e2bd28de2a","Type":"ContainerDied","Data":"eb963bdd30a0e7c8a50bc5d94f6c636a3ae3e772e2eda0f38f311119d0353d07"} Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.420875 4666 scope.go:117] "RemoveContainer" containerID="b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.433676 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.453207 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.471386 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.478429 4666 scope.go:117] "RemoveContainer" containerID="edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.504994 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.517484 4666 scope.go:117] "RemoveContainer" containerID="b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8" Dec 03 12:48:18 crc kubenswrapper[4666]: E1203 12:48:18.525549 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8\": container with ID starting with b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8 not found: ID does not exist" containerID="b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.525605 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8"} err="failed to get container status \"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8\": rpc error: code = NotFound desc = could not find container \"b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8\": container with ID starting with b9d03aaa2d0715ee3481dab6d692ea44e8ffaeca3aaf991f04f1685dbe6f11f8 not found: ID does not exist" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.525643 4666 scope.go:117] "RemoveContainer" containerID="edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d" Dec 03 12:48:18 crc kubenswrapper[4666]: E1203 12:48:18.526240 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d\": container with ID starting with edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d not found: ID does not exist" containerID="edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.526317 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d"} err="failed to get container status \"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d\": 
rpc error: code = NotFound desc = could not find container \"edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d\": container with ID starting with edb7093f9c7d4ba19314e431dba62282bde249b4505936599af25202c3552c4d not found: ID does not exist" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.535403 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: E1203 12:48:18.535916 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-log" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.535943 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-log" Dec 03 12:48:18 crc kubenswrapper[4666]: E1203 12:48:18.535960 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dddd9cb-a490-4862-b1c5-3f581891ba57" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.535970 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dddd9cb-a490-4862-b1c5-3f581891ba57" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 12:48:18 crc kubenswrapper[4666]: E1203 12:48:18.535990 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-metadata" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.535999 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-metadata" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.536254 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-metadata" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.536279 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" containerName="nova-metadata-log" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.536296 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dddd9cb-a490-4862-b1c5-3f581891ba57" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.537148 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.541565 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.541803 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.546023 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.555402 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.568055 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.569427 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.571331 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.572651 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.582856 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.710835 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.710881 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.710998 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711056 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711136 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711163 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711220 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kw7\" (UniqueName: \"kubernetes.io/projected/22c93a41-9194-4d6c-a77a-3310870cb513-kube-api-access-r9kw7\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711270 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711356 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nc5v\" (UniqueName: \"kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.711404 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813455 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813506 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813554 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813579 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813614 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813629 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813650 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kw7\" (UniqueName: \"kubernetes.io/projected/22c93a41-9194-4d6c-a77a-3310870cb513-kube-api-access-r9kw7\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813670 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813713 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nc5v\" (UniqueName: \"kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.813741 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.815141 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.821630 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.822432 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.822836 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.824318 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.824920 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.827849 4666 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c93a41-9194-4d6c-a77a-3310870cb513-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.828660 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.839486 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nc5v\" (UniqueName: \"kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v\") pod \"nova-metadata-0\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") " pod="openstack/nova-metadata-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.845063 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kw7\" (UniqueName: \"kubernetes.io/projected/22c93a41-9194-4d6c-a77a-3310870cb513-kube-api-access-r9kw7\") pod \"nova-cell1-novncproxy-0\" (UID: \"22c93a41-9194-4d6c-a77a-3310870cb513\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.857011 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:18 crc kubenswrapper[4666]: I1203 12:48:18.890646 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.141022 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.393134 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22c93a41-9194-4d6c-a77a-3310870cb513","Type":"ContainerStarted","Data":"049ac1ecae408fe6b774a47acc997cfe25b6ecdd6272186f40c70a9eaa5b9986"} Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.394247 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22c93a41-9194-4d6c-a77a-3310870cb513","Type":"ContainerStarted","Data":"070b07b0e4fc8d4c3f2cb9c6943249bf32659b25a0f38ba04e0639ac6eeae531"} Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.424784 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.424759412 podStartE2EDuration="1.424759412s" podCreationTimestamp="2025-12-03 12:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:19.415355579 +0000 UTC m=+2088.260316660" watchObservedRunningTime="2025-12-03 12:48:19.424759412 +0000 UTC m=+2088.269720483" Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.434457 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dddd9cb-a490-4862-b1c5-3f581891ba57" path="/var/lib/kubelet/pods/4dddd9cb-a490-4862-b1c5-3f581891ba57/volumes" Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.435223 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7589d785-1f6d-46cb-93e3-61e2bd28de2a" path="/var/lib/kubelet/pods/7589d785-1f6d-46cb-93e3-61e2bd28de2a/volumes" Dec 03 12:48:19 crc 
kubenswrapper[4666]: W1203 12:48:19.438250 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c5eed6_fb78_4116_87c2_83fb558fe6d3.slice/crio-a019fb2c5f7d4b4326dceaf8c4de1a6e8fafca34e5324442319fcd6c71ae872d WatchSource:0}: Error finding container a019fb2c5f7d4b4326dceaf8c4de1a6e8fafca34e5324442319fcd6c71ae872d: Status 404 returned error can't find the container with id a019fb2c5f7d4b4326dceaf8c4de1a6e8fafca34e5324442319fcd6c71ae872d Dec 03 12:48:19 crc kubenswrapper[4666]: I1203 12:48:19.438742 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.406466 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerStarted","Data":"448b08431b3789595853f49f578055b843862d54ca093a70196b97a47e0f68d6"} Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.406820 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerStarted","Data":"1434cec8a98ca76f74d5b92296a4fe1e43e4250f6aa8a0e51ef625e08cbf13f0"} Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.406843 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerStarted","Data":"a019fb2c5f7d4b4326dceaf8c4de1a6e8fafca34e5324442319fcd6c71ae872d"} Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.431443 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.431422639 podStartE2EDuration="2.431422639s" podCreationTimestamp="2025-12-03 12:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:20.424845522 +0000 UTC m=+2089.269806573" watchObservedRunningTime="2025-12-03 12:48:20.431422639 +0000 UTC m=+2089.276383690" Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.617342 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.618044 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.618110 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 12:48:20 crc kubenswrapper[4666]: I1203 12:48:20.620924 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.423204 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.450345 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.687156 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.689015 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.710955 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.774619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.774707 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.774749 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.774767 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zv9n\" (UniqueName: \"kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.774798 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.876615 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.876751 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.876821 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.876876 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.876901 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zv9n\" (UniqueName: \"kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.878315 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.878934 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.879666 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.880283 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:21 crc kubenswrapper[4666]: I1203 12:48:21.900640 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zv9n\" (UniqueName: \"kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n\") pod \"dnsmasq-dns-5b856c5697-lf8px\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:22 crc kubenswrapper[4666]: I1203 12:48:22.045364 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:22 crc kubenswrapper[4666]: I1203 12:48:22.510772 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.459808 4666 generic.go:334] "Generic (PLEG): container finished" podID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerID="258164d55eb35df9b31648e05d73a4e613d3c242e2e80c6ae1cf4d0584da030e" exitCode=0 Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.459864 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" event={"ID":"c2ace9fc-00fe-4ec8-9f86-770e476d30e8","Type":"ContainerDied","Data":"258164d55eb35df9b31648e05d73a4e613d3c242e2e80c6ae1cf4d0584da030e"} Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.460325 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" event={"ID":"c2ace9fc-00fe-4ec8-9f86-770e476d30e8","Type":"ContainerStarted","Data":"b66e8e3b675d2c5b34f1f896bdfe539b5ef61c26791a1e09b9edd88c130c73d4"} Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.857129 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.891167 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 12:48:23 crc kubenswrapper[4666]: I1203 12:48:23.892379 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.021260 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.021570 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-central-agent" containerID="cri-o://9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.021651 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="sg-core" containerID="cri-o://c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.021683 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-notification-agent" containerID="cri-o://42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.021761 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="proxy-httpd" containerID="cri-o://c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.176061 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.502510 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" 
event={"ID":"c2ace9fc-00fe-4ec8-9f86-770e476d30e8","Type":"ContainerStarted","Data":"dee5259a64b28f0cbe97b94d789142f18a267184cbca8497b1ce9c399fd6f43a"} Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.503922 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.509993 4666 generic.go:334] "Generic (PLEG): container finished" podID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerID="c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c" exitCode=0 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510025 4666 generic.go:334] "Generic (PLEG): container finished" podID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerID="c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc" exitCode=2 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510037 4666 generic.go:334] "Generic (PLEG): container finished" podID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerID="9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36" exitCode=0 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510346 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-log" containerID="cri-o://81f514f62bbee44890c51713efc64ca9e1440e88210594251a57709dc0fa07e9" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510624 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerDied","Data":"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c"} Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510653 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerDied","Data":"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc"} Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510663 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerDied","Data":"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36"} Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.510722 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-api" containerID="cri-o://f8656f33884cc910982e58a86c1461e8b46a48f97ca902fed2f4a4ee012603f5" gracePeriod=30 Dec 03 12:48:24 crc kubenswrapper[4666]: I1203 12:48:24.522742 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" podStartSLOduration=3.5227161369999997 podStartE2EDuration="3.522716137s" podCreationTimestamp="2025-12-03 12:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:24.522517422 +0000 UTC m=+2093.367478503" watchObservedRunningTime="2025-12-03 12:48:24.522716137 +0000 UTC m=+2093.367677188" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.295459 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362699 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362744 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362796 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362821 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbd72\" (UniqueName: \"kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362840 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362890 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.362972 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.363017 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts\") pod \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\" (UID: \"26b800ef-0a58-47b1-ac57-ef8ce7b82285\") " Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.363259 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.363353 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.363864 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.363883 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26b800ef-0a58-47b1-ac57-ef8ce7b82285-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.370383 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72" (OuterVolumeSpecName: "kube-api-access-bbd72") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "kube-api-access-bbd72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.379584 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts" (OuterVolumeSpecName: "scripts") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.403326 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.445356 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.452309 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.465618 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.465647 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.465657 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbd72\" (UniqueName: \"kubernetes.io/projected/26b800ef-0a58-47b1-ac57-ef8ce7b82285-kube-api-access-bbd72\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.465667 4666 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.465675 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.477361 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data" (OuterVolumeSpecName: "config-data") pod "26b800ef-0a58-47b1-ac57-ef8ce7b82285" (UID: "26b800ef-0a58-47b1-ac57-ef8ce7b82285"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.521453 4666 generic.go:334] "Generic (PLEG): container finished" podID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerID="81f514f62bbee44890c51713efc64ca9e1440e88210594251a57709dc0fa07e9" exitCode=143 Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.521493 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerDied","Data":"81f514f62bbee44890c51713efc64ca9e1440e88210594251a57709dc0fa07e9"} Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.524255 4666 generic.go:334] "Generic (PLEG): container finished" podID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerID="42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74" exitCode=0 Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.524285 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerDied","Data":"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74"} Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.524325 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26b800ef-0a58-47b1-ac57-ef8ce7b82285","Type":"ContainerDied","Data":"a38fbbfe870d0e30cb789245ef3a2d5355f91031978f74e822469a6f3793b7e1"} Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.524345 4666 scope.go:117] "RemoveContainer" containerID="c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.524361 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.545276 4666 scope.go:117] "RemoveContainer" containerID="c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.558561 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.566285 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.567289 4666 scope.go:117] "RemoveContainer" containerID="42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.572963 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b800ef-0a58-47b1-ac57-ef8ce7b82285-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.580566 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.581234 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="proxy-httpd" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.581322 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="proxy-httpd" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.581438 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="sg-core" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.581507 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="sg-core" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.581586 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-notification-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.581660 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-notification-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.581755 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-central-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.581823 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-central-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.582044 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-central-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.582139 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="sg-core" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.582218 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="ceilometer-notification-agent" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.582295 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" containerName="proxy-httpd" Dec 03 
12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.584556 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.589898 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.589985 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.590004 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.607276 4666 scope.go:117] "RemoveContainer" containerID="9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.608853 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.633596 4666 scope.go:117] "RemoveContainer" containerID="c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.633985 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c\": container with ID starting with c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c not found: ID does not exist" containerID="c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634015 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c"} err="failed to get container status \"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c\": rpc error: code = NotFound desc = could not find container \"c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c\": container with ID starting with c9f925cc433b1f8d5d32f2e0646403c4803f0226f5573900b529748c8bb8911c not found: ID does not exist" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634045 4666 scope.go:117] "RemoveContainer" containerID="c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.634409 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc\": container with ID starting with c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc not found: ID does not exist" containerID="c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634467 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc"} err="failed to get container status \"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc\": rpc error: code = NotFound desc = could not find container \"c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc\": container with ID starting with c64371a42f78588cf1a42ba371cdff380c4dcd7a3f38ef737d3dbfa74dfda8fc not found: ID does not exist" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634499 4666 scope.go:117] 
"RemoveContainer" containerID="42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.634775 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74\": container with ID starting with 42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74 not found: ID does not exist" containerID="42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634800 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74"} err="failed to get container status \"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74\": rpc error: code = NotFound desc = could not find container \"42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74\": container with ID starting with 42d1dc48b8ac0fe75fa0f24dd0ca848c90224bf47fe1a229f66dd66dba1f8e74 not found: ID does not exist" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.634816 4666 scope.go:117] "RemoveContainer" containerID="9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36" Dec 03 12:48:25 crc kubenswrapper[4666]: E1203 12:48:25.635053 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36\": container with ID starting with 9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36 not found: ID does not exist" containerID="9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.635103 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36"} err="failed to get container status \"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36\": rpc error: code = NotFound desc = could not find container \"9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36\": container with ID starting with 9ac2cd8e26ea85f6912e0ce38d2773620dfdbd91774234f465dbf397ac1ccf36 not found: ID does not exist" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.674815 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.674897 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.674951 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc 
kubenswrapper[4666]: I1203 12:48:25.674982 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.675030 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cf9r\" (UniqueName: \"kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.675063 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.675108 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.675130 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776505 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776597 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776656 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776693 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776733 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cf9r\" (UniqueName: \"kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r\") pod 
\"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776767 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776800 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.776828 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.777503 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.778312 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.785229 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.785450 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.785580 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.786283 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.786413 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" 
Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.802401 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cf9r\" (UniqueName: \"kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r\") pod \"ceilometer-0\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " pod="openstack/ceilometer-0" Dec 03 12:48:25 crc kubenswrapper[4666]: I1203 12:48:25.899053 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:48:26 crc kubenswrapper[4666]: I1203 12:48:26.270252 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:26 crc kubenswrapper[4666]: I1203 12:48:26.390618 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:48:26 crc kubenswrapper[4666]: W1203 12:48:26.392278 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2ed7f8_42cc_4266_a8c1_66873a5e0850.slice/crio-8cf4c55beaebc9d81f7d8107429fa0620a5a979a44adaa7300d908733ef36609 WatchSource:0}: Error finding container 8cf4c55beaebc9d81f7d8107429fa0620a5a979a44adaa7300d908733ef36609: Status 404 returned error can't find the container with id 8cf4c55beaebc9d81f7d8107429fa0620a5a979a44adaa7300d908733ef36609 Dec 03 12:48:26 crc kubenswrapper[4666]: I1203 12:48:26.535784 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerStarted","Data":"8cf4c55beaebc9d81f7d8107429fa0620a5a979a44adaa7300d908733ef36609"} Dec 03 12:48:27 crc kubenswrapper[4666]: I1203 12:48:27.209677 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:48:27 crc kubenswrapper[4666]: I1203 12:48:27.435743 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b800ef-0a58-47b1-ac57-ef8ce7b82285" path="/var/lib/kubelet/pods/26b800ef-0a58-47b1-ac57-ef8ce7b82285/volumes" Dec 03 12:48:27 crc kubenswrapper[4666]: I1203 12:48:27.547152 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerStarted","Data":"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b"} Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.572640 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerStarted","Data":"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1"} Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.575623 4666 generic.go:334] "Generic (PLEG): container finished" podID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerID="f8656f33884cc910982e58a86c1461e8b46a48f97ca902fed2f4a4ee012603f5" exitCode=0 Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.575672 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerDied","Data":"f8656f33884cc910982e58a86c1461e8b46a48f97ca902fed2f4a4ee012603f5"} Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.575691 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27e2b2a8-a5c9-4847-9116-f23895f6ef64","Type":"ContainerDied","Data":"52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f"} Dec 03 12:48:28 crc 
kubenswrapper[4666]: I1203 12:48:28.575704 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52cd25a174fe65a24d6889f7ba89762853a8a7c321d9509190b4636fb1148e2f" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.621978 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.728619 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft7k9\" (UniqueName: \"kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9\") pod \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.728684 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle\") pod \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.728748 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs\") pod \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.728824 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data\") pod \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\" (UID: \"27e2b2a8-a5c9-4847-9116-f23895f6ef64\") " Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.729214 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs" (OuterVolumeSpecName: "logs") pod "27e2b2a8-a5c9-4847-9116-f23895f6ef64" (UID: "27e2b2a8-a5c9-4847-9116-f23895f6ef64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.734293 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9" (OuterVolumeSpecName: "kube-api-access-ft7k9") pod "27e2b2a8-a5c9-4847-9116-f23895f6ef64" (UID: "27e2b2a8-a5c9-4847-9116-f23895f6ef64"). InnerVolumeSpecName "kube-api-access-ft7k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.754469 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data" (OuterVolumeSpecName: "config-data") pod "27e2b2a8-a5c9-4847-9116-f23895f6ef64" (UID: "27e2b2a8-a5c9-4847-9116-f23895f6ef64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.761824 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27e2b2a8-a5c9-4847-9116-f23895f6ef64" (UID: "27e2b2a8-a5c9-4847-9116-f23895f6ef64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.832371 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27e2b2a8-a5c9-4847-9116-f23895f6ef64-logs\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.832673 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.832684 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft7k9\" (UniqueName: \"kubernetes.io/projected/27e2b2a8-a5c9-4847-9116-f23895f6ef64-kube-api-access-ft7k9\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.832694 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27e2b2a8-a5c9-4847-9116-f23895f6ef64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.858552 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.874479 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.890932 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 12:48:28 crc kubenswrapper[4666]: I1203 12:48:28.890998 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.589681 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerStarted","Data":"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1"} Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.589766 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.607851 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.611783 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.618108 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.637650 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:29 crc kubenswrapper[4666]: E1203 12:48:29.638143 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-api" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.638159 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-api" Dec 03 12:48:29 crc kubenswrapper[4666]: E1203 12:48:29.638173 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-log" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.638179 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-log" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.638362 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-log" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.638388 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" containerName="nova-api-api" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.639342 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.641313 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.643011 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.645040 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.665418 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.747928 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.748152 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4dg7\" (UniqueName: \"kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.748419 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.748561 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.748613 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.748648 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.840713 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-826qj"] Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.842082 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.847052 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.849147 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850200 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850252 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850284 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850336 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850369 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4dg7\" (UniqueName: \"kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850414 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.850511 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-826qj"] Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.851211 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.855758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.857569 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.860804 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.871774 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.878659 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4dg7\" (UniqueName: \"kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7\") pod \"nova-api-0\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") " pod="openstack/nova-api-0" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.908314 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.908338 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.952682 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6hv\" (UniqueName: \"kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.953049 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.953259 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.953435 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-826qj\" (UID: 
\"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:29 crc kubenswrapper[4666]: I1203 12:48:29.956665 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.055235 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.055300 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6hv\" (UniqueName: \"kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.055341 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.055401 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.059910 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.060767 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.062954 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.076943 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6hv\" (UniqueName: \"kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv\") pod \"nova-cell1-cell-mapping-826qj\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") " pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.241557 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-826qj" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.442342 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.619204 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerStarted","Data":"3bd9e26af1a270ad750d0d312b67d6400d9738abc3d1d9554506a79faa8febf3"} Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.630351 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerStarted","Data":"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63"} Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.630397 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-central-agent" containerID="cri-o://3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b" gracePeriod=30 Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.630535 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="proxy-httpd" containerID="cri-o://8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63" gracePeriod=30 Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.630607 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="sg-core" containerID="cri-o://a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1" gracePeriod=30 Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.630647 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-notification-agent" containerID="cri-o://a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1" gracePeriod=30 Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.631078 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.656449 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.841193971 podStartE2EDuration="5.656431572s" podCreationTimestamp="2025-12-03 12:48:25 +0000 UTC" firstStartedPulling="2025-12-03 12:48:26.394064966 +0000 UTC m=+2095.239026017" lastFinishedPulling="2025-12-03 12:48:30.209302567 +0000 UTC m=+2099.054263618" observedRunningTime="2025-12-03 12:48:30.651048057 +0000 UTC m=+2099.496009118" watchObservedRunningTime="2025-12-03 12:48:30.656431572 +0000 UTC m=+2099.501392613" Dec 03 12:48:30 crc kubenswrapper[4666]: I1203 12:48:30.709338 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-826qj"] Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.496246 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e2b2a8-a5c9-4847-9116-f23895f6ef64" path="/var/lib/kubelet/pods/27e2b2a8-a5c9-4847-9116-f23895f6ef64/volumes" Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.638696 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerStarted","Data":"3940374f72ce7850abc7e36938d2eef536b0b10d74b8472cc0074664a42ae653"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.638746 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerStarted","Data":"5b5a3b7f735f52cb1f3b10ba8c4786318660139dde4db2b204ef5afffa730f35"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.642463 4666 generic.go:334] "Generic (PLEG): container finished" podID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerID="a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1" exitCode=2 Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.642502 4666 generic.go:334] "Generic (PLEG): container finished" podID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerID="a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1" exitCode=0 Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.642504 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerDied","Data":"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.642548 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerDied","Data":"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.644297 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-826qj" event={"ID":"d49a6a9d-4720-4ff0-995e-612b636b2a92","Type":"ContainerStarted","Data":"e664b29f6d0b1942a9718c8c47cbc4cf3fb8e248819376147a5ea8417a67d422"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.644324 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-826qj" event={"ID":"d49a6a9d-4720-4ff0-995e-612b636b2a92","Type":"ContainerStarted","Data":"46d75cfa35e968355c01c818d08c59285d9bcde4e003e34a502b2f4eb243d4a8"} Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.673720 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.673698405 podStartE2EDuration="2.673698405s" podCreationTimestamp="2025-12-03 12:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:31.660306684 +0000 UTC m=+2100.505267735" watchObservedRunningTime="2025-12-03 12:48:31.673698405 +0000 UTC m=+2100.518659456" Dec 03 12:48:31 crc kubenswrapper[4666]: I1203 12:48:31.683448 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-826qj" podStartSLOduration=2.6834334269999998 podStartE2EDuration="2.683433427s" podCreationTimestamp="2025-12-03 12:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:31.677032374 +0000 UTC m=+2100.521993435" watchObservedRunningTime="2025-12-03 12:48:31.683433427 +0000 UTC m=+2100.528394478" Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.047265 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 
12:48:32.134487 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.134729 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="dnsmasq-dns" containerID="cri-o://1724b4cce6e821b8de73dc7074f9b90ddd03803628e7914a2bd8d94ac6fe276a" gracePeriod=10
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.663414 4666 generic.go:334] "Generic (PLEG): container finished" podID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerID="1724b4cce6e821b8de73dc7074f9b90ddd03803628e7914a2bd8d94ac6fe276a" exitCode=0
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.664465 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" event={"ID":"ab129ab9-27f4-43c5-aca6-397236fb03c1","Type":"ContainerDied","Data":"1724b4cce6e821b8de73dc7074f9b90ddd03803628e7914a2bd8d94ac6fe276a"}
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.664491 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" event={"ID":"ab129ab9-27f4-43c5-aca6-397236fb03c1","Type":"ContainerDied","Data":"0a793764d03de8e2647aea7963c4111b72379405645c43984f28a46469d700de"}
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.664516 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a793764d03de8e2647aea7963c4111b72379405645c43984f28a46469d700de"
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.687161 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.806030 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb\") pod \"ab129ab9-27f4-43c5-aca6-397236fb03c1\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") "
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.806079 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config\") pod \"ab129ab9-27f4-43c5-aca6-397236fb03c1\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") "
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.806249 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4b29\" (UniqueName: \"kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29\") pod \"ab129ab9-27f4-43c5-aca6-397236fb03c1\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") "
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.806323 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc\") pod \"ab129ab9-27f4-43c5-aca6-397236fb03c1\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") "
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.806412 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb\") pod \"ab129ab9-27f4-43c5-aca6-397236fb03c1\" (UID: \"ab129ab9-27f4-43c5-aca6-397236fb03c1\") "
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.820569 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29" (OuterVolumeSpecName: "kube-api-access-c4b29") pod "ab129ab9-27f4-43c5-aca6-397236fb03c1" (UID: "ab129ab9-27f4-43c5-aca6-397236fb03c1"). InnerVolumeSpecName "kube-api-access-c4b29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.863827 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab129ab9-27f4-43c5-aca6-397236fb03c1" (UID: "ab129ab9-27f4-43c5-aca6-397236fb03c1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.869499 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab129ab9-27f4-43c5-aca6-397236fb03c1" (UID: "ab129ab9-27f4-43c5-aca6-397236fb03c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.875281 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config" (OuterVolumeSpecName: "config") pod "ab129ab9-27f4-43c5-aca6-397236fb03c1" (UID: "ab129ab9-27f4-43c5-aca6-397236fb03c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.885995 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab129ab9-27f4-43c5-aca6-397236fb03c1" (UID: "ab129ab9-27f4-43c5-aca6-397236fb03c1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.911468 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.911504 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.911515 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.911524 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab129ab9-27f4-43c5-aca6-397236fb03c1-config\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:32 crc kubenswrapper[4666]: I1203 12:48:32.911533 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4b29\" (UniqueName: \"kubernetes.io/projected/ab129ab9-27f4-43c5-aca6-397236fb03c1-kube-api-access-c4b29\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:33 crc kubenswrapper[4666]: I1203 12:48:33.674135 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerDied","Data":"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b"}
Dec 03 12:48:33 crc kubenswrapper[4666]: I1203 12:48:33.674077 4666 generic.go:334] "Generic (PLEG): container finished" podID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerID="3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b" exitCode=0
Dec 03 12:48:33 crc kubenswrapper[4666]: I1203 12:48:33.674276 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt"
Dec 03 12:48:33 crc kubenswrapper[4666]: I1203 12:48:33.694377 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:48:33 crc kubenswrapper[4666]: I1203 12:48:33.705023 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-c9jvt"]
Dec 03 12:48:35 crc kubenswrapper[4666]: I1203 12:48:35.434110 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" path="/var/lib/kubelet/pods/ab129ab9-27f4-43c5-aca6-397236fb03c1/volumes"
Dec 03 12:48:35 crc kubenswrapper[4666]: I1203 12:48:35.697134 4666 generic.go:334] "Generic (PLEG): container finished" podID="d49a6a9d-4720-4ff0-995e-612b636b2a92" containerID="e664b29f6d0b1942a9718c8c47cbc4cf3fb8e248819376147a5ea8417a67d422" exitCode=0
Dec 03 12:48:35 crc kubenswrapper[4666]: I1203 12:48:35.697186 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-826qj" event={"ID":"d49a6a9d-4720-4ff0-995e-612b636b2a92","Type":"ContainerDied","Data":"e664b29f6d0b1942a9718c8c47cbc4cf3fb8e248819376147a5ea8417a67d422"}
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.237727 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"]
Dec 03 12:48:36 crc kubenswrapper[4666]: E1203 12:48:36.238807 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="init"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.238827 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="init"
Dec 03 12:48:36 crc kubenswrapper[4666]: E1203 12:48:36.238850 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="dnsmasq-dns"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.238856 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="dnsmasq-dns"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.239036 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="dnsmasq-dns"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.246576 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.252461 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"]
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.378202 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlh9d\" (UniqueName: \"kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.378332 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.378369 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.480326 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlh9d\" (UniqueName: \"kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.480481 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.480521 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.481654 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.481764 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.510100 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlh9d\" (UniqueName: \"kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d\") pod \"certified-operators-7zxcr\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:36 crc kubenswrapper[4666]: I1203 12:48:36.572566 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zxcr"
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.161554 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-826qj"
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.265736 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"]
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.301787 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data\") pod \"d49a6a9d-4720-4ff0-995e-612b636b2a92\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") "
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.302608 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts\") pod \"d49a6a9d-4720-4ff0-995e-612b636b2a92\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") "
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.302957 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle\") pod \"d49a6a9d-4720-4ff0-995e-612b636b2a92\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") "
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.303005 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm6hv\" (UniqueName: \"kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv\") pod \"d49a6a9d-4720-4ff0-995e-612b636b2a92\" (UID: \"d49a6a9d-4720-4ff0-995e-612b636b2a92\") "
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.307403 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv" (OuterVolumeSpecName: "kube-api-access-tm6hv") pod "d49a6a9d-4720-4ff0-995e-612b636b2a92" (UID: "d49a6a9d-4720-4ff0-995e-612b636b2a92"). InnerVolumeSpecName "kube-api-access-tm6hv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.310827 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts" (OuterVolumeSpecName: "scripts") pod "d49a6a9d-4720-4ff0-995e-612b636b2a92" (UID: "d49a6a9d-4720-4ff0-995e-612b636b2a92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.328260 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data" (OuterVolumeSpecName: "config-data") pod "d49a6a9d-4720-4ff0-995e-612b636b2a92" (UID: "d49a6a9d-4720-4ff0-995e-612b636b2a92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.348621 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d49a6a9d-4720-4ff0-995e-612b636b2a92" (UID: "d49a6a9d-4720-4ff0-995e-612b636b2a92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.405136 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.405165 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.405177 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49a6a9d-4720-4ff0-995e-612b636b2a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.405189 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm6hv\" (UniqueName: \"kubernetes.io/projected/d49a6a9d-4720-4ff0-995e-612b636b2a92-kube-api-access-tm6hv\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.576066 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-c9jvt" podUID="ab129ab9-27f4-43c5-aca6-397236fb03c1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: i/o timeout"
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.720398 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-826qj" event={"ID":"d49a6a9d-4720-4ff0-995e-612b636b2a92","Type":"ContainerDied","Data":"46d75cfa35e968355c01c818d08c59285d9bcde4e003e34a502b2f4eb243d4a8"}
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.720463 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d75cfa35e968355c01c818d08c59285d9bcde4e003e34a502b2f4eb243d4a8"
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.720663 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-826qj"
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.722611 4666 generic.go:334] "Generic (PLEG): container finished" podID="5ff97f24-07fe-4314-a483-b39299527235" containerID="83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe" exitCode=0
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.722673 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerDied","Data":"83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe"}
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.722733 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerStarted","Data":"f39e4254930a5f3aa71a17ce280a40e82c83d8180bcf33e4146c2f934d96fe5f"}
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.953721 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.954106 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-log" containerID="cri-o://5b5a3b7f735f52cb1f3b10ba8c4786318660139dde4db2b204ef5afffa730f35" gracePeriod=30
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.954223 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-api" containerID="cri-o://3940374f72ce7850abc7e36938d2eef536b0b10d74b8472cc0074664a42ae653" gracePeriod=30
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.973218 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.973609 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" containerName="nova-scheduler-scheduler" containerID="cri-o://eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" gracePeriod=30
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.991241 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.991907 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-log" containerID="cri-o://1434cec8a98ca76f74d5b92296a4fe1e43e4250f6aa8a0e51ef625e08cbf13f0" gracePeriod=30
Dec 03 12:48:37 crc kubenswrapper[4666]: I1203 12:48:37.992052 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-metadata" containerID="cri-o://448b08431b3789595853f49f578055b843862d54ca093a70196b97a47e0f68d6" gracePeriod=30
Dec 03 12:48:38 crc kubenswrapper[4666]: E1203 12:48:38.588639 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 12:48:38 crc kubenswrapper[4666]: E1203 12:48:38.590058 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 12:48:38 crc kubenswrapper[4666]: E1203 12:48:38.591389 4666 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 12:48:38 crc kubenswrapper[4666]: E1203 12:48:38.591464 4666 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" containerName="nova-scheduler-scheduler"
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.762380 4666 generic.go:334] "Generic (PLEG): container finished" podID="88044cd5-3b59-4350-9e69-24163d57e43f" containerID="3940374f72ce7850abc7e36938d2eef536b0b10d74b8472cc0074664a42ae653" exitCode=0
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.763010 4666 generic.go:334] "Generic (PLEG): container finished" podID="88044cd5-3b59-4350-9e69-24163d57e43f" containerID="5b5a3b7f735f52cb1f3b10ba8c4786318660139dde4db2b204ef5afffa730f35" exitCode=143
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.763354 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerDied","Data":"3940374f72ce7850abc7e36938d2eef536b0b10d74b8472cc0074664a42ae653"}
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.763411 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerDied","Data":"5b5a3b7f735f52cb1f3b10ba8c4786318660139dde4db2b204ef5afffa730f35"}
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.774891 4666 generic.go:334] "Generic (PLEG): container finished" podID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerID="448b08431b3789595853f49f578055b843862d54ca093a70196b97a47e0f68d6" exitCode=0
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.774924 4666 generic.go:334] "Generic (PLEG): container finished" podID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerID="1434cec8a98ca76f74d5b92296a4fe1e43e4250f6aa8a0e51ef625e08cbf13f0" exitCode=143
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.774949 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerDied","Data":"448b08431b3789595853f49f578055b843862d54ca093a70196b97a47e0f68d6"}
Dec 03 12:48:41 crc kubenswrapper[4666]: I1203 12:48:41.774978 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerDied","Data":"1434cec8a98ca76f74d5b92296a4fe1e43e4250f6aa8a0e51ef625e08cbf13f0"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.434482 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.438844 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.501836 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs\") pod \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.501906 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4dg7\" (UniqueName: \"kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.501959 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502012 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle\") pod \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502081 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502124 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nc5v\" (UniqueName: \"kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v\") pod \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502183 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502207 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs\") pod \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502237 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data\") pod \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\" (UID: \"e7c5eed6-fb78-4116-87c2-83fb558fe6d3\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502344 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.502378 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs\") pod \"88044cd5-3b59-4350-9e69-24163d57e43f\" (UID: \"88044cd5-3b59-4350-9e69-24163d57e43f\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.503886 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs" (OuterVolumeSpecName: "logs") pod "e7c5eed6-fb78-4116-87c2-83fb558fe6d3" (UID: "e7c5eed6-fb78-4116-87c2-83fb558fe6d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.505571 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs" (OuterVolumeSpecName: "logs") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.511586 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7" (OuterVolumeSpecName: "kube-api-access-f4dg7") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "kube-api-access-f4dg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.530745 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v" (OuterVolumeSpecName: "kube-api-access-7nc5v") pod "e7c5eed6-fb78-4116-87c2-83fb558fe6d3" (UID: "e7c5eed6-fb78-4116-87c2-83fb558fe6d3"). InnerVolumeSpecName "kube-api-access-7nc5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.532029 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.542933 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data" (OuterVolumeSpecName: "config-data") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.543267 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7c5eed6-fb78-4116-87c2-83fb558fe6d3" (UID: "e7c5eed6-fb78-4116-87c2-83fb558fe6d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.552429 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data" (OuterVolumeSpecName: "config-data") pod "e7c5eed6-fb78-4116-87c2-83fb558fe6d3" (UID: "e7c5eed6-fb78-4116-87c2-83fb558fe6d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.572776 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e7c5eed6-fb78-4116-87c2-83fb558fe6d3" (UID: "e7c5eed6-fb78-4116-87c2-83fb558fe6d3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.577190 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.581796 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88044cd5-3b59-4350-9e69-24163d57e43f" (UID: "88044cd5-3b59-4350-9e69-24163d57e43f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603837 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603868 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603877 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603885 4666 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603898 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88044cd5-3b59-4350-9e69-24163d57e43f-logs\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603907 4666 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603917 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4dg7\" (UniqueName: \"kubernetes.io/projected/88044cd5-3b59-4350-9e69-24163d57e43f-kube-api-access-f4dg7\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603927 4666 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603935 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603944 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88044cd5-3b59-4350-9e69-24163d57e43f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.603953 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nc5v\" (UniqueName: \"kubernetes.io/projected/e7c5eed6-fb78-4116-87c2-83fb558fe6d3-kube-api-access-7nc5v\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.783946 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7c5eed6-fb78-4116-87c2-83fb558fe6d3","Type":"ContainerDied","Data":"a019fb2c5f7d4b4326dceaf8c4de1a6e8fafca34e5324442319fcd6c71ae872d"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.783991 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.784047 4666 scope.go:117] "RemoveContainer" containerID="448b08431b3789595853f49f578055b843862d54ca093a70196b97a47e0f68d6"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.785741 4666 generic.go:334] "Generic (PLEG): container finished" podID="67e03c9f-9729-44b8-954e-128a4829d47c" containerID="eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" exitCode=0
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.785793 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67e03c9f-9729-44b8-954e-128a4829d47c","Type":"ContainerDied","Data":"eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.785838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67e03c9f-9729-44b8-954e-128a4829d47c","Type":"ContainerDied","Data":"0f4b7b81761250524ce6d92e438e1e23dca11e8b396346204e72a1dd0d10d495"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.785852 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4b7b81761250524ce6d92e438e1e23dca11e8b396346204e72a1dd0d10d495"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.788290 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88044cd5-3b59-4350-9e69-24163d57e43f","Type":"ContainerDied","Data":"3bd9e26af1a270ad750d0d312b67d6400d9738abc3d1d9554506a79faa8febf3"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.788364 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.790254 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.795102 4666 generic.go:334] "Generic (PLEG): container finished" podID="5ff97f24-07fe-4314-a483-b39299527235" containerID="516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f" exitCode=0
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.795142 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerDied","Data":"516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f"}
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.806678 4666 scope.go:117] "RemoveContainer" containerID="1434cec8a98ca76f74d5b92296a4fe1e43e4250f6aa8a0e51ef625e08cbf13f0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.847136 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.860837 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.884976 4666 scope.go:117] "RemoveContainer" containerID="3940374f72ce7850abc7e36938d2eef536b0b10d74b8472cc0074664a42ae653"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887287 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887730 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887753 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887767 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-api"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887774 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-api"
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887792 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-metadata"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887798 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-metadata"
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887813 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887819 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887832 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49a6a9d-4720-4ff0-995e-612b636b2a92" containerName="nova-manage"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887837 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49a6a9d-4720-4ff0-995e-612b636b2a92" containerName="nova-manage"
Dec 03 12:48:42 crc kubenswrapper[4666]: E1203 12:48:42.887852 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" containerName="nova-scheduler-scheduler"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.887858 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" containerName="nova-scheduler-scheduler"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888024 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49a6a9d-4720-4ff0-995e-612b636b2a92" containerName="nova-manage"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888037 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-api"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888052 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" containerName="nova-scheduler-scheduler"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888062 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-metadata"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888071 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" containerName="nova-metadata-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.888083 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" containerName="nova-api-log"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.889048 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.892267 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.892575 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.908038 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle\") pod \"67e03c9f-9729-44b8-954e-128a4829d47c\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.908234 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data\") pod \"67e03c9f-9729-44b8-954e-128a4829d47c\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.908268 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tccf2\" (UniqueName: \"kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2\") pod \"67e03c9f-9729-44b8-954e-128a4829d47c\" (UID: \"67e03c9f-9729-44b8-954e-128a4829d47c\") "
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.913963 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.917681 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2" (OuterVolumeSpecName: "kube-api-access-tccf2") pod "67e03c9f-9729-44b8-954e-128a4829d47c" (UID: "67e03c9f-9729-44b8-954e-128a4829d47c"). InnerVolumeSpecName "kube-api-access-tccf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.931050 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.932187 4666 scope.go:117] "RemoveContainer" containerID="5b5a3b7f735f52cb1f3b10ba8c4786318660139dde4db2b204ef5afffa730f35"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.940991 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.947590 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67e03c9f-9729-44b8-954e-128a4829d47c" (UID: "67e03c9f-9729-44b8-954e-128a4829d47c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.948641 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.949273 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data" (OuterVolumeSpecName: "config-data") pod "67e03c9f-9729-44b8-954e-128a4829d47c" (UID: "67e03c9f-9729-44b8-954e-128a4829d47c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.950257 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.952059 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.953294 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.953959 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 03 12:48:42 crc kubenswrapper[4666]: I1203 12:48:42.972554 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.010524 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.010583 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-logs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.010805 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-config-data\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.010914 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.010984 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011045 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnnm\" (UniqueName: \"kubernetes.io/projected/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-kube-api-access-2lnnm\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011309 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011347 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae0d3e5-4249-417a-aac2-5280115b1213-logs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011416 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzjng\" (UniqueName: \"kubernetes.io/projected/8ae0d3e5-4249-417a-aac2-5280115b1213-kube-api-access-rzjng\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011471 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-config-data\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011502 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011727 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011748 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tccf2\" (UniqueName: \"kubernetes.io/projected/67e03c9f-9729-44b8-954e-128a4829d47c-kube-api-access-tccf2\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.011764 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e03c9f-9729-44b8-954e-128a4829d47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113389 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113446 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lnnm\" (UniqueName: \"kubernetes.io/projected/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-kube-api-access-2lnnm\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113496 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113523 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae0d3e5-4249-417a-aac2-5280115b1213-logs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113569 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzjng\" (UniqueName: \"kubernetes.io/projected/8ae0d3e5-4249-417a-aac2-5280115b1213-kube-api-access-rzjng\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113594 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-config-data\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113618 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113710 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113736 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-logs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113796 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-config-data\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.113831 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.114113 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae0d3e5-4249-417a-aac2-5280115b1213-logs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.114638 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-logs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.123277 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-config-data\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124115 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-config-data\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124136 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124156 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124188 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ae0d3e5-4249-417a-aac2-5280115b1213-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124232 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.124663 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.131823 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzjng\" (UniqueName: \"kubernetes.io/projected/8ae0d3e5-4249-417a-aac2-5280115b1213-kube-api-access-rzjng\") pod \"nova-metadata-0\" (UID: \"8ae0d3e5-4249-417a-aac2-5280115b1213\") " pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.136439 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lnnm\" (UniqueName: \"kubernetes.io/projected/d91a5463-a0cd-40be-90b0-e01d8f1ebdf3-kube-api-access-2lnnm\") pod \"nova-api-0\" (UID: \"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3\") " pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.218656 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.269071 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.441663 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88044cd5-3b59-4350-9e69-24163d57e43f" path="/var/lib/kubelet/pods/88044cd5-3b59-4350-9e69-24163d57e43f/volumes"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.442396 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c5eed6-fb78-4116-87c2-83fb558fe6d3" path="/var/lib/kubelet/pods/e7c5eed6-fb78-4116-87c2-83fb558fe6d3/volumes"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.684632 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.769443 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.809974 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ae0d3e5-4249-417a-aac2-5280115b1213","Type":"ContainerStarted","Data":"d23b88d15a8a85a2662755aaf923377295ec00381ad467cbaadc13cbbe36bbd6"}
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.819686 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerStarted","Data":"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633"}
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.822859 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fg8p9"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.825459 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fg8p9"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.829123 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.829302 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3","Type":"ContainerStarted","Data":"544285ed3da317bb49ddbf3c8a9777a248e1412fcb809b7baf1f028d01d6027c"}
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.854218 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fg8p9"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.868104 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zxcr" podStartSLOduration=2.06691082 podStartE2EDuration="7.868074197s" podCreationTimestamp="2025-12-03 12:48:36 +0000 UTC" firstStartedPulling="2025-12-03 12:48:37.724489526 +0000 UTC m=+2106.569450577" lastFinishedPulling="2025-12-03 12:48:43.525652903 +0000 UTC m=+2112.370613954" observedRunningTime="2025-12-03 12:48:43.851612254 +0000 UTC m=+2112.696573305" watchObservedRunningTime="2025-12-03 12:48:43.868074197 +0000 UTC m=+2112.713035248"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.928353 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.929527 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6s8\" (UniqueName: \"kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.929623 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.929684 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9"
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.933876 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.942304 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.943622 4666 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.946728 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 12:48:43 crc kubenswrapper[4666]: I1203 12:48:43.962867 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032139 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffk9\" (UniqueName: \"kubernetes.io/projected/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-kube-api-access-8ffk9\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032227 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6s8\" (UniqueName: \"kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032276 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032304 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032321 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-config-data\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032349 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032932 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.032946 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 
12:48:44.049758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6s8\" (UniqueName: \"kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8\") pod \"community-operators-fg8p9\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.134520 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.135120 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-config-data\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.135714 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffk9\" (UniqueName: \"kubernetes.io/projected/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-kube-api-access-8ffk9\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.141626 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-config-data\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.142000 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.152610 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffk9\" (UniqueName: \"kubernetes.io/projected/2a6eed91-42df-4c6b-aaa0-7882ecfb941a-kube-api-access-8ffk9\") pod \"nova-scheduler-0\" (UID: \"2a6eed91-42df-4c6b-aaa0-7882ecfb941a\") " pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.163758 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.261520 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.705346 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fg8p9"] Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.841931 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3","Type":"ContainerStarted","Data":"d4bc41e4ccde3e1c381def45676b95334c9d25e45d4a74a1e1077adeb7fb262b"} Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.841973 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d91a5463-a0cd-40be-90b0-e01d8f1ebdf3","Type":"ContainerStarted","Data":"9ed0f5f61458cf51f16c73e3eaa72932b1a3a0f76e5c9e17ae096ee634999ab3"} Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.844514 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ae0d3e5-4249-417a-aac2-5280115b1213","Type":"ContainerStarted","Data":"1d1f938e426d3537055099110a58e6cf71aeeb7ebcbdf11cc6335bd3d23d6bbc"} Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.844544 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ae0d3e5-4249-417a-aac2-5280115b1213","Type":"ContainerStarted","Data":"309580920f26e3f8f2604f899f68a41d8f16403ec54ae0e293c8fe548ec9e0b1"} Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.853032 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerStarted","Data":"681f31882f768d2b0fe61c8f1a02f1f7bcc12bf1718cddba6ce0a8823addca2a"} Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.904510 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.916573 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.91654975 podStartE2EDuration="2.91654975s" podCreationTimestamp="2025-12-03 12:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:44.891965697 +0000 UTC m=+2113.736926758" watchObservedRunningTime="2025-12-03 12:48:44.91654975 +0000 UTC m=+2113.761510801" Dec 03 12:48:44 crc kubenswrapper[4666]: I1203 12:48:44.941670 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.941650446 podStartE2EDuration="2.941650446s" podCreationTimestamp="2025-12-03 12:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:44.915124131 +0000 UTC m=+2113.760085182" watchObservedRunningTime="2025-12-03 12:48:44.941650446 +0000 UTC m=+2113.786611517" Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.438338 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e03c9f-9729-44b8-954e-128a4829d47c" path="/var/lib/kubelet/pods/67e03c9f-9729-44b8-954e-128a4829d47c/volumes" Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.864902 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2a6eed91-42df-4c6b-aaa0-7882ecfb941a","Type":"ContainerStarted","Data":"8384b7b9cc33f3f95f5adc0a0fd119aa4853085c25e14966fb7c38264d692518"} Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.864964 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a6eed91-42df-4c6b-aaa0-7882ecfb941a","Type":"ContainerStarted","Data":"b38f204a9c45e591d0dff82e195d972edbc6b8a50d2e1fb717002f073d875990"} Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.867346 4666 generic.go:334] "Generic (PLEG): container finished" podID="38838880-d425-4c9b-83a0-9585da2094a1" containerID="9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d" exitCode=0 Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.867429 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerDied","Data":"9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d"} Dec 03 12:48:45 crc kubenswrapper[4666]: I1203 12:48:45.893759 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.893735303 podStartE2EDuration="2.893735303s" podCreationTimestamp="2025-12-03 12:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:48:45.886977061 +0000 UTC m=+2114.731938152" watchObservedRunningTime="2025-12-03 12:48:45.893735303 +0000 UTC m=+2114.738696364" Dec 03 12:48:46 crc kubenswrapper[4666]: I1203 12:48:46.573335 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:46 crc kubenswrapper[4666]: I1203 12:48:46.573412 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:46 crc kubenswrapper[4666]: I1203 12:48:46.625187 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:46 crc kubenswrapper[4666]: I1203 12:48:46.878933 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerStarted","Data":"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a"} Dec 03 12:48:47 crc kubenswrapper[4666]: I1203 12:48:47.891321 4666 generic.go:334] "Generic (PLEG): container finished" podID="38838880-d425-4c9b-83a0-9585da2094a1" containerID="cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a" exitCode=0 Dec 03 12:48:47 crc kubenswrapper[4666]: I1203 12:48:47.891364 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerDied","Data":"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a"} Dec 03 12:48:48 crc kubenswrapper[4666]: I1203 12:48:48.219675 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 12:48:48 crc kubenswrapper[4666]: I1203 12:48:48.219736 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 12:48:48 crc kubenswrapper[4666]: I1203 12:48:48.903547 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerStarted","Data":"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd"} Dec 03 12:48:48 crc kubenswrapper[4666]: I1203 12:48:48.931302 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fg8p9" podStartSLOduration=3.282706781 podStartE2EDuration="5.931285646s" podCreationTimestamp="2025-12-03 12:48:43 +0000 UTC" firstStartedPulling="2025-12-03 12:48:45.869238983 +0000 UTC m=+2114.714200034" lastFinishedPulling="2025-12-03 12:48:48.517817848 +0000 UTC m=+2117.362778899" observedRunningTime="2025-12-03 12:48:48.92399683 +0000 UTC m=+2117.768957881" watchObservedRunningTime="2025-12-03 12:48:48.931285646 +0000 UTC m=+2117.776246697" Dec 03 12:48:49 crc kubenswrapper[4666]: I1203 12:48:49.262635 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 12:48:53 crc kubenswrapper[4666]: I1203 12:48:53.218983 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 12:48:53 crc kubenswrapper[4666]: I1203 12:48:53.222058 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 12:48:53 crc kubenswrapper[4666]: I1203 12:48:53.270322 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:48:53 crc kubenswrapper[4666]: I1203 12:48:53.270390 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.164223 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.164257 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.214521 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.236333 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ae0d3e5-4249-417a-aac2-5280115b1213" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.236356 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ae0d3e5-4249-417a-aac2-5280115b1213" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.183:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.261785 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.285320 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d91a5463-a0cd-40be-90b0-e01d8f1ebdf3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.184:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:54 crc 
kubenswrapper[4666]: I1203 12:48:54.285375 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d91a5463-a0cd-40be-90b0-e01d8f1ebdf3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.184:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:48:54 crc kubenswrapper[4666]: I1203 12:48:54.298467 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 12:48:55 crc kubenswrapper[4666]: I1203 12:48:55.006970 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 12:48:55 crc kubenswrapper[4666]: I1203 12:48:55.024823 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:55 crc kubenswrapper[4666]: I1203 12:48:55.093995 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fg8p9"] Dec 03 12:48:55 crc kubenswrapper[4666]: I1203 12:48:55.906975 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 12:48:56 crc kubenswrapper[4666]: I1203 12:48:56.645868 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:56 crc kubenswrapper[4666]: I1203 12:48:56.864424 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"] Dec 03 12:48:56 crc kubenswrapper[4666]: I1203 12:48:56.975052 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zxcr" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="registry-server" containerID="cri-o://30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633" gracePeriod=2 Dec 03 12:48:56 crc kubenswrapper[4666]: I1203 12:48:56.975234 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fg8p9" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="registry-server" containerID="cri-o://f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd" gracePeriod=2 Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.550579 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.556251 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701129 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content\") pod \"5ff97f24-07fe-4314-a483-b39299527235\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701234 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml6s8\" (UniqueName: \"kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8\") pod \"38838880-d425-4c9b-83a0-9585da2094a1\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701297 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities\") pod \"38838880-d425-4c9b-83a0-9585da2094a1\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701336 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities\") pod \"5ff97f24-07fe-4314-a483-b39299527235\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701460 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content\") pod \"38838880-d425-4c9b-83a0-9585da2094a1\" (UID: \"38838880-d425-4c9b-83a0-9585da2094a1\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlh9d\" (UniqueName: \"kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d\") pod \"5ff97f24-07fe-4314-a483-b39299527235\" (UID: \"5ff97f24-07fe-4314-a483-b39299527235\") " Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.701976 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities" (OuterVolumeSpecName: "utilities") pod "5ff97f24-07fe-4314-a483-b39299527235" (UID: "5ff97f24-07fe-4314-a483-b39299527235"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.703680 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities" (OuterVolumeSpecName: "utilities") pod "38838880-d425-4c9b-83a0-9585da2094a1" (UID: "38838880-d425-4c9b-83a0-9585da2094a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.708573 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8" (OuterVolumeSpecName: "kube-api-access-ml6s8") pod "38838880-d425-4c9b-83a0-9585da2094a1" (UID: "38838880-d425-4c9b-83a0-9585da2094a1"). InnerVolumeSpecName "kube-api-access-ml6s8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.714251 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d" (OuterVolumeSpecName: "kube-api-access-nlh9d") pod "5ff97f24-07fe-4314-a483-b39299527235" (UID: "5ff97f24-07fe-4314-a483-b39299527235"). InnerVolumeSpecName "kube-api-access-nlh9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.760152 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38838880-d425-4c9b-83a0-9585da2094a1" (UID: "38838880-d425-4c9b-83a0-9585da2094a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.770160 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ff97f24-07fe-4314-a483-b39299527235" (UID: "5ff97f24-07fe-4314-a483-b39299527235"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.803916 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml6s8\" (UniqueName: \"kubernetes.io/projected/38838880-d425-4c9b-83a0-9585da2094a1-kube-api-access-ml6s8\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.803961 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.803971 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.803982 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38838880-d425-4c9b-83a0-9585da2094a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.803992 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlh9d\" (UniqueName: \"kubernetes.io/projected/5ff97f24-07fe-4314-a483-b39299527235-kube-api-access-nlh9d\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.804031 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff97f24-07fe-4314-a483-b39299527235-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.985659 4666 generic.go:334] "Generic (PLEG): container finished" podID="5ff97f24-07fe-4314-a483-b39299527235" containerID="30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633" exitCode=0 Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.985691 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" 
event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerDied","Data":"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633"} Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.985720 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zxcr" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.985733 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zxcr" event={"ID":"5ff97f24-07fe-4314-a483-b39299527235","Type":"ContainerDied","Data":"f39e4254930a5f3aa71a17ce280a40e82c83d8180bcf33e4146c2f934d96fe5f"} Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.985762 4666 scope.go:117] "RemoveContainer" containerID="30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633" Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.988921 4666 generic.go:334] "Generic (PLEG): container finished" podID="38838880-d425-4c9b-83a0-9585da2094a1" containerID="f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd" exitCode=0 Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.988967 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerDied","Data":"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd"} Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.988997 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fg8p9" event={"ID":"38838880-d425-4c9b-83a0-9585da2094a1","Type":"ContainerDied","Data":"681f31882f768d2b0fe61c8f1a02f1f7bcc12bf1718cddba6ce0a8823addca2a"} Dec 03 12:48:57 crc kubenswrapper[4666]: I1203 12:48:57.989068 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fg8p9" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.015682 4666 scope.go:117] "RemoveContainer" containerID="516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.016015 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"] Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.026020 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zxcr"] Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.034670 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fg8p9"] Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.041224 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fg8p9"] Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.044518 4666 scope.go:117] "RemoveContainer" containerID="83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.064355 4666 scope.go:117] "RemoveContainer" containerID="30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.064899 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633\": container with ID starting with 30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633 not found: ID does not exist" containerID="30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.064947 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633"} err="failed to get container status \"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633\": rpc error: code = NotFound desc = could not find container \"30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633\": container with ID starting with 30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633 not found: ID does not exist" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.064977 4666 scope.go:117] "RemoveContainer" containerID="516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.065540 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f\": container with ID starting with 516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f not found: ID does not exist" containerID="516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.065576 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f"} err="failed to get container status \"516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f\": rpc error: code = NotFound desc = could not find container \"516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f\": container with ID starting with 
516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f not found: ID does not exist" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.065600 4666 scope.go:117] "RemoveContainer" containerID="83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.065969 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe\": container with ID starting with 83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe not found: ID does not exist" containerID="83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.065995 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe"} err="failed to get container status \"83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe\": rpc error: code = NotFound desc = could not find container \"83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe\": container with ID starting with 83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe not found: ID does not exist" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.066011 4666 scope.go:117] "RemoveContainer" containerID="f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.084694 4666 scope.go:117] "RemoveContainer" containerID="cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.144319 4666 scope.go:117] "RemoveContainer" containerID="9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.175983 4666 scope.go:117] "RemoveContainer" containerID="f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.176489 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd\": container with ID starting with f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd not found: ID does not exist" containerID="f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.176552 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd"} err="failed to get container status \"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd\": rpc error: code = NotFound desc = could not find container \"f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd\": container with ID starting with f13ee1e279a85f437e031d126d06fef96688c38dba9f4969a70b5c5161cf9ddd not found: ID does not exist" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.176602 4666 scope.go:117] "RemoveContainer" containerID="cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.176991 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a\": container 
with ID starting with cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a not found: ID does not exist" containerID="cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.177022 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a"} err="failed to get container status \"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a\": rpc error: code = NotFound desc = could not find container \"cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a\": container with ID starting with cfc092076fa96f15219e53ffc523e97381b2bf0f0da9a1b23968f2461fca978a not found: ID does not exist" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.177047 4666 scope.go:117] "RemoveContainer" containerID="9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d" Dec 03 12:48:58 crc kubenswrapper[4666]: E1203 12:48:58.177721 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d\": container with ID starting with 9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d not found: ID does not exist" containerID="9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d" Dec 03 12:48:58 crc kubenswrapper[4666]: I1203 12:48:58.177750 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d"} err="failed to get container status \"9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d\": rpc error: code = NotFound desc = could not find container \"9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d\": container with ID starting with 9294d2b985ac3726ef19410f83b4fc1e87395cbfb1d3a22dae9835a3bcda879d not found: ID does not exist" Dec 03 12:48:59 crc kubenswrapper[4666]: E1203 12:48:59.192703 4666 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_nova-scheduler-0_67e03c9f-9729-44b8-954e-128a4829d47c/nova-scheduler-scheduler/0.log" to get inode usage: stat /var/log/pods/openstack_nova-scheduler-0_67e03c9f-9729-44b8-954e-128a4829d47c/nova-scheduler-scheduler/0.log: no such file or directory Dec 03 12:48:59 crc kubenswrapper[4666]: I1203 12:48:59.460919 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38838880-d425-4c9b-83a0-9585da2094a1" path="/var/lib/kubelet/pods/38838880-d425-4c9b-83a0-9585da2094a1/volumes" Dec 03 12:48:59 crc kubenswrapper[4666]: I1203 12:48:59.462508 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff97f24-07fe-4314-a483-b39299527235" path="/var/lib/kubelet/pods/5ff97f24-07fe-4314-a483-b39299527235/volumes" Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.689561 4666 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-conmon-516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-conmon-516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f.scope: no such file 
or directory Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.689894 4666 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-516f042b6f79c6d7243d1d6edbbd483c3ffa9b00dfe88036aaa31b447c785d2f.scope: no such file or directory Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.692252 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe.scope WatchSource:0}: Error finding container 83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe: Status 404 returned error can't find the container with id 83bb0c3d9051f778176e5da0e5e14a375b749ce9b60b9633c3fd35c7ba4b5ebe Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.694972 4666 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-conmon-30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-conmon-30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633.scope: no such file or directory Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.695052 4666 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-30c184fc5061a7d69c5ff0d6d15c2ea334b00e33d46f5c1fd6d6f6c8960af633.scope: no such file or directory Dec 03 12:49:00 crc kubenswrapper[4666]: W1203 12:49:00.695125 4666 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38838880_d425_4c9b_83a0_9585da2094a1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38838880_d425_4c9b_83a0_9585da2094a1.slice: no such file or directory Dec 03 12:49:00 crc kubenswrapper[4666]: E1203 12:49:00.954355 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2ed7f8_42cc_4266_a8c1_66873a5e0850.slice/crio-8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff97f24_07fe_4314_a483_b39299527235.slice/crio-f39e4254930a5f3aa71a17ce280a40e82c83d8180bcf33e4146c2f934d96fe5f\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2ed7f8_42cc_4266_a8c1_66873a5e0850.slice/crio-conmon-8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.046730 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.052762 4666 generic.go:334] "Generic (PLEG): container finished" podID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerID="8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63" exitCode=137 Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.052814 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerDied","Data":"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63"} Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.052850 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e2ed7f8-42cc-4266-a8c1-66873a5e0850","Type":"ContainerDied","Data":"8cf4c55beaebc9d81f7d8107429fa0620a5a979a44adaa7300d908733ef36609"} Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.052872 4666 scope.go:117] "RemoveContainer" containerID="8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.074717 4666 scope.go:117] "RemoveContainer" containerID="a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.097844 4666 scope.go:117] "RemoveContainer" containerID="a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.114473 4666 scope.go:117] "RemoveContainer" containerID="3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.132309 4666 scope.go:117] "RemoveContainer" containerID="8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63" Dec 03 12:49:01 crc kubenswrapper[4666]: E1203 12:49:01.132800 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63\": container with ID starting with 8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63 not found: ID does not exist" containerID="8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.132838 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63"} err="failed to get container status \"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63\": rpc error: code = NotFound desc = could not find container \"8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63\": container with ID starting with 8c6e10fb3b1dafbc561c0c5220ead804ddd5c4d47221ac30017cbdc66d28cd63 not found: ID does not exist" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.132871 4666 scope.go:117] "RemoveContainer" containerID="a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1" Dec 03 12:49:01 crc kubenswrapper[4666]: E1203 12:49:01.133211 4666 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1\": container with ID starting with a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1 not found: ID does not exist" containerID="a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.133232 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1"} err="failed to get container status \"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1\": rpc error: code = NotFound desc = could not find container \"a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1\": container with ID starting with a9a8183c5c354b95c6c52e11669ac41040be0f2ad4aa7f510164c410c66155d1 not found: ID does not exist" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.133244 4666 scope.go:117] "RemoveContainer" containerID="a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1" Dec 03 12:49:01 crc kubenswrapper[4666]: E1203 12:49:01.133534 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1\": container with ID starting with a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1 not found: ID does not exist" containerID="a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.133556 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1"} err="failed to get container status \"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1\": rpc error: code = NotFound desc = could not find container \"a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1\": container with ID starting with a120e6ceef4f3664b012d8abc849d2d20c4ccc0e206c083ff5cbd6153f45b8f1 not found: ID does not exist" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.133568 4666 scope.go:117] "RemoveContainer" containerID="3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b" Dec 03 12:49:01 crc kubenswrapper[4666]: E1203 12:49:01.133808 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b\": container with ID starting with 3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b not found: ID does not exist" containerID="3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.133856 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b"} err="failed to get container status \"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b\": rpc error: code = NotFound desc = could not find container \"3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b\": container with ID starting with 3646141beca52ea8565e73ef4284efad008572a3885371fd777179a805e0803b not found: ID does not exist" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.184922 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.184968 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185002 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185069 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185241 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185284 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185314 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.185347 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cf9r\" (UniqueName: \"kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r\") pod \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\" (UID: \"5e2ed7f8-42cc-4266-a8c1-66873a5e0850\") " Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.186216 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.186419 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.190856 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts" (OuterVolumeSpecName: "scripts") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.191674 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r" (OuterVolumeSpecName: "kube-api-access-4cf9r") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "kube-api-access-4cf9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.211963 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.259984 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.270287 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286804 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286835 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286843 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286853 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cf9r\" (UniqueName: \"kubernetes.io/projected/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-kube-api-access-4cf9r\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286863 4666 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286870 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.286879 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.301190 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data" (OuterVolumeSpecName: "config-data") pod "5e2ed7f8-42cc-4266-a8c1-66873a5e0850" (UID: "5e2ed7f8-42cc-4266-a8c1-66873a5e0850"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:01 crc kubenswrapper[4666]: I1203 12:49:01.388698 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e2ed7f8-42cc-4266-a8c1-66873a5e0850-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.064005 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.096024 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.105822 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.130693 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131079 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="extract-utilities" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131120 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="extract-utilities" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131157 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="extract-content" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131166 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="extract-content" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131180 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-notification-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131189 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-notification-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131202 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="sg-core" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131209 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="sg-core" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131220 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="extract-utilities" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131227 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="extract-utilities" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131261 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-central-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131270 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-central-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131283 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="extract-content" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131290 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="extract-content" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131300 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="registry-server" Dec 03 12:49:02 crc 
kubenswrapper[4666]: I1203 12:49:02.131308 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="registry-server" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131319 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="proxy-httpd" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131327 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="proxy-httpd" Dec 03 12:49:02 crc kubenswrapper[4666]: E1203 12:49:02.131338 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="registry-server" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131346 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="registry-server" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131554 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff97f24-07fe-4314-a483-b39299527235" containerName="registry-server" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131569 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-central-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131580 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="proxy-httpd" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131591 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="sg-core" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131601 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="38838880-d425-4c9b-83a0-9585da2094a1" containerName="registry-server" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.131618 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" containerName="ceilometer-notification-agent" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.134589 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.138618 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.138646 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.138820 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.147048 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.206719 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.206891 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.206931 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.206977 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.207054 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.207194 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.207227 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvsf\" (UniqueName: \"kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.207263 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308481 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308548 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308601 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308862 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308896 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.308929 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.309029 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.309049 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvsf\" (UniqueName: \"kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.309124 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.309719 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.313046 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.313788 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.313868 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.314619 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.325380 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvsf\" (UniqueName: \"kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.325448 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " pod="openstack/ceilometer-0" Dec 03 12:49:02 crc kubenswrapper[4666]: I1203 12:49:02.453858 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:49:03 crc kubenswrapper[4666]: W1203 12:49:03.006759 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d139cd6_1cd0_4d2b_a353_808575a9d272.slice/crio-38eef3fee6c9d8a2624553ac19ea480a62e31fd015789daf9028dd7522472a9e WatchSource:0}: Error finding container 38eef3fee6c9d8a2624553ac19ea480a62e31fd015789daf9028dd7522472a9e: Status 404 returned error can't find the container with id 38eef3fee6c9d8a2624553ac19ea480a62e31fd015789daf9028dd7522472a9e Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.008305 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.083059 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerStarted","Data":"38eef3fee6c9d8a2624553ac19ea480a62e31fd015789daf9028dd7522472a9e"} Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.226213 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.231792 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.234984 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.278627 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.279007 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.280702 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.288729 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 12:49:03 crc kubenswrapper[4666]: I1203 12:49:03.433508 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2ed7f8-42cc-4266-a8c1-66873a5e0850" path="/var/lib/kubelet/pods/5e2ed7f8-42cc-4266-a8c1-66873a5e0850/volumes" Dec 03 12:49:04 crc kubenswrapper[4666]: I1203 12:49:04.094978 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerStarted","Data":"65a67164bd8029fc0805f9561d317fb45c899cf4dfb2f22fb379789a99c68945"} Dec 03 12:49:04 crc kubenswrapper[4666]: I1203 12:49:04.095450 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 12:49:04 crc kubenswrapper[4666]: I1203 12:49:04.100152 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 12:49:04 crc kubenswrapper[4666]: I1203 12:49:04.109938 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 12:49:05 crc kubenswrapper[4666]: I1203 12:49:05.103562 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerStarted","Data":"888dbaf538a49771c40e431e7d07815607819755fed8244ae3263094bbbb5940"} Dec 03 
12:49:06 crc kubenswrapper[4666]: I1203 12:49:06.125901 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerStarted","Data":"26aea19984fdc2ea693683e6ced5e4ac73f66bffde3b5ddc5109d70e52f5395d"} Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.146292 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerStarted","Data":"1eff69a01d383ce295b54faa506ef0c4c14e591212a901c04257e29cc5b8d928"} Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.146641 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.181659 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.543280322 podStartE2EDuration="5.181639199s" podCreationTimestamp="2025-12-03 12:49:02 +0000 UTC" firstStartedPulling="2025-12-03 12:49:03.011439126 +0000 UTC m=+2131.856400217" lastFinishedPulling="2025-12-03 12:49:06.649798043 +0000 UTC m=+2135.494759094" observedRunningTime="2025-12-03 12:49:07.172510504 +0000 UTC m=+2136.017471575" watchObservedRunningTime="2025-12-03 12:49:07.181639199 +0000 UTC m=+2136.026600250" Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.771740 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.773842 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.785516 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.940363 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.940807 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:07 crc kubenswrapper[4666]: I1203 12:49:07.940974 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svvcs\" (UniqueName: \"kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.042472 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svvcs\" (UniqueName: \"kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc 
kubenswrapper[4666]: I1203 12:49:08.042564 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.042650 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.043182 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.043212 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.070651 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svvcs\" (UniqueName: \"kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs\") pod \"redhat-operators-jr8fr\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.156354 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:08 crc kubenswrapper[4666]: I1203 12:49:08.608363 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:08 crc kubenswrapper[4666]: W1203 12:49:08.609618 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1648b503_7656_4eeb_8774_14ca927f3b14.slice/crio-16ad5edd0eb0483b5bb908e10cc7752c272c97155d73c2a6f8f66928a2d15c2f WatchSource:0}: Error finding container 16ad5edd0eb0483b5bb908e10cc7752c272c97155d73c2a6f8f66928a2d15c2f: Status 404 returned error can't find the container with id 16ad5edd0eb0483b5bb908e10cc7752c272c97155d73c2a6f8f66928a2d15c2f Dec 03 12:49:09 crc kubenswrapper[4666]: I1203 12:49:09.164530 4666 generic.go:334] "Generic (PLEG): container finished" podID="1648b503-7656-4eeb-8774-14ca927f3b14" containerID="faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7" exitCode=0 Dec 03 12:49:09 crc kubenswrapper[4666]: I1203 12:49:09.164570 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerDied","Data":"faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7"} Dec 03 12:49:09 crc kubenswrapper[4666]: I1203 12:49:09.164834 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerStarted","Data":"16ad5edd0eb0483b5bb908e10cc7752c272c97155d73c2a6f8f66928a2d15c2f"} Dec 03 12:49:09 crc kubenswrapper[4666]: I1203 12:49:09.866184 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:49:09 crc kubenswrapper[4666]: I1203 12:49:09.866512 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:49:10 crc kubenswrapper[4666]: I1203 12:49:10.204962 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerStarted","Data":"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918"} Dec 03 12:49:14 crc kubenswrapper[4666]: I1203 12:49:14.271058 4666 generic.go:334] "Generic (PLEG): container finished" podID="1648b503-7656-4eeb-8774-14ca927f3b14" containerID="3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918" exitCode=0 Dec 03 12:49:14 crc kubenswrapper[4666]: I1203 12:49:14.271659 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerDied","Data":"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918"} Dec 03 12:49:15 crc kubenswrapper[4666]: I1203 12:49:15.285845 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" 
event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerStarted","Data":"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7"} Dec 03 12:49:15 crc kubenswrapper[4666]: I1203 12:49:15.312916 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jr8fr" podStartSLOduration=2.81817466 podStartE2EDuration="8.312895515s" podCreationTimestamp="2025-12-03 12:49:07 +0000 UTC" firstStartedPulling="2025-12-03 12:49:09.166236679 +0000 UTC m=+2138.011197730" lastFinishedPulling="2025-12-03 12:49:14.660957534 +0000 UTC m=+2143.505918585" observedRunningTime="2025-12-03 12:49:15.306620168 +0000 UTC m=+2144.151581229" watchObservedRunningTime="2025-12-03 12:49:15.312895515 +0000 UTC m=+2144.157856566" Dec 03 12:49:18 crc kubenswrapper[4666]: I1203 12:49:18.156683 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:18 crc kubenswrapper[4666]: I1203 12:49:18.157195 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:19 crc kubenswrapper[4666]: I1203 12:49:19.204651 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jr8fr" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="registry-server" probeResult="failure" output=< Dec 03 12:49:19 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 12:49:19 crc kubenswrapper[4666]: > Dec 03 12:49:28 crc kubenswrapper[4666]: I1203 12:49:28.232787 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:28 crc kubenswrapper[4666]: I1203 12:49:28.290295 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:28 crc kubenswrapper[4666]: I1203 12:49:28.462603 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.439351 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jr8fr" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="registry-server" containerID="cri-o://cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7" gracePeriod=2 Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.874358 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.965686 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svvcs\" (UniqueName: \"kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs\") pod \"1648b503-7656-4eeb-8774-14ca927f3b14\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.965935 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities\") pod \"1648b503-7656-4eeb-8774-14ca927f3b14\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.966129 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content\") pod \"1648b503-7656-4eeb-8774-14ca927f3b14\" (UID: \"1648b503-7656-4eeb-8774-14ca927f3b14\") " Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.967042 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities" (OuterVolumeSpecName: "utilities") pod "1648b503-7656-4eeb-8774-14ca927f3b14" (UID: "1648b503-7656-4eeb-8774-14ca927f3b14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:29 crc kubenswrapper[4666]: I1203 12:49:29.971963 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs" (OuterVolumeSpecName: "kube-api-access-svvcs") pod "1648b503-7656-4eeb-8774-14ca927f3b14" (UID: "1648b503-7656-4eeb-8774-14ca927f3b14"). InnerVolumeSpecName "kube-api-access-svvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.067672 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svvcs\" (UniqueName: \"kubernetes.io/projected/1648b503-7656-4eeb-8774-14ca927f3b14-kube-api-access-svvcs\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.068037 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.072941 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1648b503-7656-4eeb-8774-14ca927f3b14" (UID: "1648b503-7656-4eeb-8774-14ca927f3b14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.169550 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1648b503-7656-4eeb-8774-14ca927f3b14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.449932 4666 generic.go:334] "Generic (PLEG): container finished" podID="1648b503-7656-4eeb-8774-14ca927f3b14" containerID="cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7" exitCode=0 Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.449981 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerDied","Data":"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7"} Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.450017 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jr8fr" event={"ID":"1648b503-7656-4eeb-8774-14ca927f3b14","Type":"ContainerDied","Data":"16ad5edd0eb0483b5bb908e10cc7752c272c97155d73c2a6f8f66928a2d15c2f"} Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.450038 4666 scope.go:117] "RemoveContainer" containerID="cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.450988 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jr8fr" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.473014 4666 scope.go:117] "RemoveContainer" containerID="3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.493741 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.500722 4666 scope.go:117] "RemoveContainer" containerID="faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.503505 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jr8fr"] Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.535560 4666 scope.go:117] "RemoveContainer" containerID="cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7" Dec 03 12:49:30 crc kubenswrapper[4666]: E1203 12:49:30.536282 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7\": container with ID starting with cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7 not found: ID does not exist" containerID="cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.536345 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7"} err="failed to get container status \"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7\": rpc error: code = NotFound desc = could not find container \"cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7\": container with ID starting with cb65ff0ee0251b3b0f187cdaff71b99e60deb14601bd3afb5c3f66ea3f96b5a7 not found: ID does not exist" Dec 03 12:49:30 crc 
kubenswrapper[4666]: I1203 12:49:30.536378 4666 scope.go:117] "RemoveContainer" containerID="3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918" Dec 03 12:49:30 crc kubenswrapper[4666]: E1203 12:49:30.537081 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918\": container with ID starting with 3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918 not found: ID does not exist" containerID="3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.537124 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918"} err="failed to get container status \"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918\": rpc error: code = NotFound desc = could not find container \"3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918\": container with ID starting with 3d8ca4572e39b5af1251f86f5249bc3bf7f54fd1947b1931c9a256f85b064918 not found: ID does not exist" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.537143 4666 scope.go:117] "RemoveContainer" containerID="faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7" Dec 03 12:49:30 crc kubenswrapper[4666]: E1203 12:49:30.537570 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7\": container with ID starting with faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7 not found: ID does not exist" containerID="faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7" Dec 03 12:49:30 crc kubenswrapper[4666]: I1203 12:49:30.537595 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7"} err="failed to get container status \"faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7\": rpc error: code = NotFound desc = could not find container \"faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7\": container with ID starting with faddbe08f6fe7122a9aa499b6634caaa149c9cc58ec9c0c7470cdd882290dab7 not found: ID does not exist" Dec 03 12:49:31 crc kubenswrapper[4666]: I1203 12:49:31.450894 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" path="/var/lib/kubelet/pods/1648b503-7656-4eeb-8774-14ca927f3b14/volumes" Dec 03 12:49:32 crc kubenswrapper[4666]: I1203 12:49:32.462629 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 12:49:39 crc kubenswrapper[4666]: I1203 12:49:39.866437 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:49:39 crc kubenswrapper[4666]: I1203 12:49:39.866943 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:49:42 crc kubenswrapper[4666]: I1203 12:49:42.434507 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:43 crc kubenswrapper[4666]: I1203 12:49:43.188661 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:46 crc kubenswrapper[4666]: I1203 12:49:46.797275 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" containerID="cri-o://dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488" gracePeriod=604796 Dec 03 12:49:47 crc kubenswrapper[4666]: I1203 12:49:47.179129 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" containerID="cri-o://68cd18f4ae6aaff7863fe5b233fc1030b3b24abe7733723b0a0c020849b4e998" gracePeriod=604797 Dec 03 12:49:47 crc kubenswrapper[4666]: I1203 12:49:47.488320 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Dec 03 12:49:47 crc kubenswrapper[4666]: I1203 12:49:47.613880 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.550277 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613022 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613089 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613132 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613157 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613198 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613219 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613268 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613430 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613672 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613699 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7gp4\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4\") pod 
\"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.613718 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"37ef012c-8962-43f6-9c95-5a880aa57d5a\" (UID: \"37ef012c-8962-43f6-9c95-5a880aa57d5a\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.617418 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.619259 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.622408 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.624975 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.626770 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info" (OuterVolumeSpecName: "pod-info") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.632387 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.643200 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.643211 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4" (OuterVolumeSpecName: "kube-api-access-v7gp4") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "kube-api-access-v7gp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.677727 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data" (OuterVolumeSpecName: "config-data") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.690347 4666 generic.go:334] "Generic (PLEG): container finished" podID="49ae1478-c8e5-4175-bf32-f96a34996999" containerID="68cd18f4ae6aaff7863fe5b233fc1030b3b24abe7733723b0a0c020849b4e998" exitCode=0 Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.690421 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerDied","Data":"68cd18f4ae6aaff7863fe5b233fc1030b3b24abe7733723b0a0c020849b4e998"} Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.693257 4666 generic.go:334] "Generic (PLEG): container finished" podID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerID="dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488" exitCode=0 Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.693290 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerDied","Data":"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488"} Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.693323 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"37ef012c-8962-43f6-9c95-5a880aa57d5a","Type":"ContainerDied","Data":"f83c3755957db5418fd4963f1cb6afc786cbf8d020deb5cc2f67586c3b756754"} Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.693344 4666 scope.go:117] "RemoveContainer" containerID="dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.693607 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.707329 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf" (OuterVolumeSpecName: "server-conf") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717027 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717066 4666 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717079 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7gp4\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-kube-api-access-v7gp4\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717131 4666 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717145 4666 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717157 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717190 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717200 4666 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37ef012c-8962-43f6-9c95-5a880aa57d5a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717211 4666 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37ef012c-8962-43f6-9c95-5a880aa57d5a-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.717222 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37ef012c-8962-43f6-9c95-5a880aa57d5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.736378 4666 scope.go:117] "RemoveContainer" containerID="3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.739427 4666 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.762265 4666 scope.go:117] "RemoveContainer" containerID="dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488" Dec 03 12:49:53 crc kubenswrapper[4666]: E1203 12:49:53.763405 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488\": container with ID starting with dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488 not found: ID does not exist" containerID="dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.763437 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488"} err="failed to get container status \"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488\": rpc error: code = NotFound desc = could not find container \"dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488\": container with ID starting with dbc8289803348128431c3e4b1eeb027913671bc00061fae3a5c8c07a8dd02488 not found: ID does not exist" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.763459 4666 scope.go:117] "RemoveContainer" containerID="3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279" Dec 03 12:49:53 crc kubenswrapper[4666]: E1203 12:49:53.763787 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279\": container with ID starting with 3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279 not found: ID does not exist" containerID="3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.763821 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279"} err="failed to get container status \"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279\": rpc error: code = NotFound desc = could not find container \"3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279\": container with ID starting with 3247a1320b1307de3ce620c617798c47693f6adbf941c3184782f2ec2d24a279 not found: ID does not exist" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.784582 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.808295 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "37ef012c-8962-43f6-9c95-5a880aa57d5a" (UID: "37ef012c-8962-43f6-9c95-5a880aa57d5a"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.817962 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818041 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818126 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818182 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818216 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818245 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-confd\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818262 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818290 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whbmw\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818320 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818376 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: 
\"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.818410 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf\") pod \"49ae1478-c8e5-4175-bf32-f96a34996999\" (UID: \"49ae1478-c8e5-4175-bf32-f96a34996999\") " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.819067 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37ef012c-8962-43f6-9c95-5a880aa57d5a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.819103 4666 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.820456 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.821362 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.821821 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.822334 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.825941 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw" (OuterVolumeSpecName: "kube-api-access-whbmw") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "kube-api-access-whbmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.886797 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info" (OuterVolumeSpecName: "pod-info") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.887038 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.887232 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data" (OuterVolumeSpecName: "config-data") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.900304 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.916255 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf" (OuterVolumeSpecName: "server-conf") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920377 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920419 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920433 4666 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49ae1478-c8e5-4175-bf32-f96a34996999-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920446 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920482 4666 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920494 4666 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920507 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whbmw\" 
(UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-kube-api-access-whbmw\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920518 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920529 4666 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49ae1478-c8e5-4175-bf32-f96a34996999-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.920539 4666 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49ae1478-c8e5-4175-bf32-f96a34996999-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.941238 4666 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 12:49:53 crc kubenswrapper[4666]: I1203 12:49:53.967200 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "49ae1478-c8e5-4175-bf32-f96a34996999" (UID: "49ae1478-c8e5-4175-bf32-f96a34996999"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.022442 4666 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.022491 4666 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49ae1478-c8e5-4175-bf32-f96a34996999-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.102363 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.109239 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136280 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136706 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="setup-container" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136731 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="setup-container" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136748 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="setup-container" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136756 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="setup-container" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136764 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" 
containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136771 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136788 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136795 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="extract-content" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136812 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136820 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136853 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136860 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: E1203 12:49:54.136873 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.136880 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="extract-utilities" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.137072 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.137110 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" containerName="rabbitmq" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.137139 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1648b503-7656-4eeb-8774-14ca927f3b14" containerName="registry-server" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.138228 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.140860 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.141334 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.141556 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.141826 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.141950 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.142069 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jmqzx" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.144471 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.151559 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225499 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl258\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-kube-api-access-kl258\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225552 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225607 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225633 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225682 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a39a37c-b566-4726-8e2b-84be35262830-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225702 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225804 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.225898 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a39a37c-b566-4726-8e2b-84be35262830-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.226015 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.226068 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.226201 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327809 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327855 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327896 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327929 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl258\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-kube-api-access-kl258\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 
12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327954 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.327998 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328014 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328037 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a39a37c-b566-4726-8e2b-84be35262830-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328052 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328116 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328168 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a39a37c-b566-4726-8e2b-84be35262830-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328720 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.328063 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.329044 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.329317 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.329412 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.329428 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a39a37c-b566-4726-8e2b-84be35262830-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.334264 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a39a37c-b566-4726-8e2b-84be35262830-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.334289 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.334846 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a39a37c-b566-4726-8e2b-84be35262830-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.335821 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.354319 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl258\" (UniqueName: \"kubernetes.io/projected/1a39a37c-b566-4726-8e2b-84be35262830-kube-api-access-kl258\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.361995 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1a39a37c-b566-4726-8e2b-84be35262830\") " pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.456431 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.705987 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"49ae1478-c8e5-4175-bf32-f96a34996999","Type":"ContainerDied","Data":"edfe886cb548f638548646264377e416e66f2b4d59fa609709d07d0e79c3f088"} Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.706373 4666 scope.go:117] "RemoveContainer" containerID="68cd18f4ae6aaff7863fe5b233fc1030b3b24abe7733723b0a0c020849b4e998" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.706486 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.742222 4666 scope.go:117] "RemoveContainer" containerID="fca316e7ded99ec6b82b57df1075d0e8bf2f795f283750a00ec6b65878f7525c" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.756277 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.779562 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.789233 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.791279 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.793383 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.794984 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.795232 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.795338 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.795478 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.795679 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.795815 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xb68b" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.796276 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.919122 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.939622 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rp9r\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-kube-api-access-9rp9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 
12:49:54.939725 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.939790 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.939815 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.939871 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.939980 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.940026 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.940205 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.940229 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20f2c961-32c5-4a6e-8d18-5296c889d4a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc kubenswrapper[4666]: I1203 12:49:54.940297 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20f2c961-32c5-4a6e-8d18-5296c889d4a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:54 crc 
kubenswrapper[4666]: I1203 12:49:54.940351 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041620 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041732 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20f2c961-32c5-4a6e-8d18-5296c889d4a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041759 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20f2c961-32c5-4a6e-8d18-5296c889d4a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041805 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041850 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rp9r\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-kube-api-access-9rp9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041876 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041897 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041913 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041939 4666 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041960 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.041977 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.042138 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.042480 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.043435 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.043806 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.044954 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.045144 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20f2c961-32c5-4a6e-8d18-5296c889d4a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.046402 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.047064 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20f2c961-32c5-4a6e-8d18-5296c889d4a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.050587 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.052205 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20f2c961-32c5-4a6e-8d18-5296c889d4a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.063683 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rp9r\" (UniqueName: \"kubernetes.io/projected/20f2c961-32c5-4a6e-8d18-5296c889d4a3-kube-api-access-9rp9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.076896 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"20f2c961-32c5-4a6e-8d18-5296c889d4a3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.126812 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.436627 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ef012c-8962-43f6-9c95-5a880aa57d5a" path="/var/lib/kubelet/pods/37ef012c-8962-43f6-9c95-5a880aa57d5a/volumes" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.437620 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ae1478-c8e5-4175-bf32-f96a34996999" path="/var/lib/kubelet/pods/49ae1478-c8e5-4175-bf32-f96a34996999/volumes" Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.611389 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.726507 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20f2c961-32c5-4a6e-8d18-5296c889d4a3","Type":"ContainerStarted","Data":"b2a91b71bdf03e5d18247e55e87ba67799105aa7bc1b032bc087261e86a008f2"} Dec 03 12:49:55 crc kubenswrapper[4666]: I1203 12:49:55.729843 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a39a37c-b566-4726-8e2b-84be35262830","Type":"ContainerStarted","Data":"53a7b5277ee9783fb51a521406420902c4591539964f23f6f80088a83fa7c976"} Dec 03 12:49:57 crc kubenswrapper[4666]: I1203 12:49:57.750120 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a39a37c-b566-4726-8e2b-84be35262830","Type":"ContainerStarted","Data":"a1baaeaa7d76ef9c5d3239b1b13556f0760f02768082e631cb40ced7f66b5b91"} Dec 03 12:49:57 crc kubenswrapper[4666]: I1203 12:49:57.751713 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20f2c961-32c5-4a6e-8d18-5296c889d4a3","Type":"ContainerStarted","Data":"6854770ab2dfa16a371954d5b885c5328bd5685757b261aae3b024de3c007ae5"} Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.048999 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.050530 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.052798 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.069971 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197047 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd68c\" (UniqueName: \"kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197117 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197157 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197288 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197557 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.197658 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.299877 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.299939 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: 
\"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300028 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd68c\" (UniqueName: \"kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300058 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300081 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300135 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300818 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300862 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.300939 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.301492 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.301507 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 
12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.318958 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd68c\" (UniqueName: \"kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c\") pod \"dnsmasq-dns-6447ccbd8f-2h4fj\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.369411 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:49:58 crc kubenswrapper[4666]: I1203 12:49:58.839853 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:49:58 crc kubenswrapper[4666]: W1203 12:49:58.846609 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4b759eb_b3db_4d3e_87a3_3098c96f2273.slice/crio-53a035dbc099cf88b5ebf1a755804490fcedbc645390bddc7596c7930b34bbf6 WatchSource:0}: Error finding container 53a035dbc099cf88b5ebf1a755804490fcedbc645390bddc7596c7930b34bbf6: Status 404 returned error can't find the container with id 53a035dbc099cf88b5ebf1a755804490fcedbc645390bddc7596c7930b34bbf6 Dec 03 12:49:59 crc kubenswrapper[4666]: I1203 12:49:59.768696 4666 generic.go:334] "Generic (PLEG): container finished" podID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerID="bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad" exitCode=0 Dec 03 12:49:59 crc kubenswrapper[4666]: I1203 12:49:59.768753 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" event={"ID":"d4b759eb-b3db-4d3e-87a3-3098c96f2273","Type":"ContainerDied","Data":"bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad"} Dec 03 12:49:59 crc kubenswrapper[4666]: I1203 12:49:59.769012 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" event={"ID":"d4b759eb-b3db-4d3e-87a3-3098c96f2273","Type":"ContainerStarted","Data":"53a035dbc099cf88b5ebf1a755804490fcedbc645390bddc7596c7930b34bbf6"} Dec 03 12:50:00 crc kubenswrapper[4666]: I1203 12:50:00.779796 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" event={"ID":"d4b759eb-b3db-4d3e-87a3-3098c96f2273","Type":"ContainerStarted","Data":"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d"} Dec 03 12:50:00 crc kubenswrapper[4666]: I1203 12:50:00.780012 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:50:00 crc kubenswrapper[4666]: I1203 12:50:00.803517 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" podStartSLOduration=2.8034955999999998 podStartE2EDuration="2.8034956s" podCreationTimestamp="2025-12-03 12:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:00.797623834 +0000 UTC m=+2189.642584895" watchObservedRunningTime="2025-12-03 12:50:00.8034956 +0000 UTC m=+2189.648456671" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.371253 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.452882 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.458844 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="dnsmasq-dns" containerID="cri-o://dee5259a64b28f0cbe97b94d789142f18a267184cbca8497b1ce9c399fd6f43a" gracePeriod=10 Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.596315 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.598585 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.607942 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.691530 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvxw\" (UniqueName: \"kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.691821 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.691876 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.691966 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.692073 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.692189 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.794081 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.794960 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.795686 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.796676 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.796722 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mvxw\" (UniqueName: \"kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.796953 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.796987 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.797034 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.797715 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.797803 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc\") pod 
\"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.798074 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.821239 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mvxw\" (UniqueName: \"kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw\") pod \"dnsmasq-dns-864d5fc68c-ncm8v\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:08 crc kubenswrapper[4666]: I1203 12:50:08.943418 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.366081 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 12:50:09 crc kubenswrapper[4666]: W1203 12:50:09.371755 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30859df6_46bb_4671_a9e9_def5132425af.slice/crio-53f33e460d52f12c72a0902fbb8b237e8c3d2e12c79850cbfe44776648162b36 WatchSource:0}: Error finding container 53f33e460d52f12c72a0902fbb8b237e8c3d2e12c79850cbfe44776648162b36: Status 404 returned error can't find the container with id 53f33e460d52f12c72a0902fbb8b237e8c3d2e12c79850cbfe44776648162b36 Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.862219 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" event={"ID":"30859df6-46bb-4671-a9e9-def5132425af","Type":"ContainerStarted","Data":"53f33e460d52f12c72a0902fbb8b237e8c3d2e12c79850cbfe44776648162b36"} Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.867972 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.868053 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.868155 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 12:50:09.869153 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:50:09 crc kubenswrapper[4666]: I1203 
12:50:09.869264 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078" gracePeriod=600 Dec 03 12:50:12 crc kubenswrapper[4666]: I1203 12:50:12.046600 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.178:5353: connect: connection refused" Dec 03 12:50:14 crc kubenswrapper[4666]: I1203 12:50:14.727929 4666 generic.go:334] "Generic (PLEG): container finished" podID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerID="dee5259a64b28f0cbe97b94d789142f18a267184cbca8497b1ce9c399fd6f43a" exitCode=0 Dec 03 12:50:14 crc kubenswrapper[4666]: I1203 12:50:14.728043 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" event={"ID":"c2ace9fc-00fe-4ec8-9f86-770e476d30e8","Type":"ContainerDied","Data":"dee5259a64b28f0cbe97b94d789142f18a267184cbca8497b1ce9c399fd6f43a"} Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.249436 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.352224 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb\") pod \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.352346 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zv9n\" (UniqueName: \"kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n\") pod \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.352468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc\") pod \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.352559 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb\") pod \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.352612 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config\") pod \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\" (UID: \"c2ace9fc-00fe-4ec8-9f86-770e476d30e8\") " Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.363375 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n" (OuterVolumeSpecName: "kube-api-access-5zv9n") pod "c2ace9fc-00fe-4ec8-9f86-770e476d30e8" (UID: "c2ace9fc-00fe-4ec8-9f86-770e476d30e8"). 
InnerVolumeSpecName "kube-api-access-5zv9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.398729 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config" (OuterVolumeSpecName: "config") pod "c2ace9fc-00fe-4ec8-9f86-770e476d30e8" (UID: "c2ace9fc-00fe-4ec8-9f86-770e476d30e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.403216 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2ace9fc-00fe-4ec8-9f86-770e476d30e8" (UID: "c2ace9fc-00fe-4ec8-9f86-770e476d30e8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.407591 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2ace9fc-00fe-4ec8-9f86-770e476d30e8" (UID: "c2ace9fc-00fe-4ec8-9f86-770e476d30e8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.417565 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2ace9fc-00fe-4ec8-9f86-770e476d30e8" (UID: "c2ace9fc-00fe-4ec8-9f86-770e476d30e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.455179 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.455208 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.455224 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zv9n\" (UniqueName: \"kubernetes.io/projected/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-kube-api-access-5zv9n\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.455234 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.455243 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ace9fc-00fe-4ec8-9f86-770e476d30e8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.741967 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" event={"ID":"c2ace9fc-00fe-4ec8-9f86-770e476d30e8","Type":"ContainerDied","Data":"b66e8e3b675d2c5b34f1f896bdfe539b5ef61c26791a1e09b9edd88c130c73d4"} Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.742033 4666 scope.go:117] 
"RemoveContainer" containerID="dee5259a64b28f0cbe97b94d789142f18a267184cbca8497b1ce9c399fd6f43a" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.742051 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-lf8px" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.746869 4666 generic.go:334] "Generic (PLEG): container finished" podID="30859df6-46bb-4671-a9e9-def5132425af" containerID="7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2" exitCode=0 Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.746946 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" event={"ID":"30859df6-46bb-4671-a9e9-def5132425af","Type":"ContainerDied","Data":"7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2"} Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.753022 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078" exitCode=0 Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.753067 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078"} Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.753109 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"} Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.771221 4666 scope.go:117] "RemoveContainer" containerID="258164d55eb35df9b31648e05d73a4e613d3c242e2e80c6ae1cf4d0584da030e" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.807103 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.809994 4666 scope.go:117] "RemoveContainer" containerID="1e21f23cf638b75ab1c0ae740ffb922dfd1e4eab922a4e30f0ba0e96b43c5f69" Dec 03 12:50:15 crc kubenswrapper[4666]: I1203 12:50:15.820812 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-lf8px"] Dec 03 12:50:16 crc kubenswrapper[4666]: I1203 12:50:16.763196 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" event={"ID":"30859df6-46bb-4671-a9e9-def5132425af","Type":"ContainerStarted","Data":"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823"} Dec 03 12:50:16 crc kubenswrapper[4666]: I1203 12:50:16.763408 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:16 crc kubenswrapper[4666]: I1203 12:50:16.787483 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" podStartSLOduration=8.787466062 podStartE2EDuration="8.787466062s" podCreationTimestamp="2025-12-03 12:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:16.781554115 +0000 UTC m=+2205.626515186" watchObservedRunningTime="2025-12-03 12:50:16.787466062 +0000 UTC m=+2205.632427113" Dec 03 12:50:17 crc 
kubenswrapper[4666]: I1203 12:50:17.434668 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" path="/var/lib/kubelet/pods/c2ace9fc-00fe-4ec8-9f86-770e476d30e8/volumes" Dec 03 12:50:23 crc kubenswrapper[4666]: I1203 12:50:23.945316 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.023686 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.024008 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="dnsmasq-dns" containerID="cri-o://5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d" gracePeriod=10 Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.484059 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.511468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.511896 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.511954 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.511982 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd68c\" (UniqueName: \"kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.512028 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.512070 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config\") pod \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\" (UID: \"d4b759eb-b3db-4d3e-87a3-3098c96f2273\") " Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.527375 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c" (OuterVolumeSpecName: "kube-api-access-pd68c") pod 
"d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "kube-api-access-pd68c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.565831 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.583462 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.584626 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.585520 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config" (OuterVolumeSpecName: "config") pod "d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.588986 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d4b759eb-b3db-4d3e-87a3-3098c96f2273" (UID: "d4b759eb-b3db-4d3e-87a3-3098c96f2273"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613751 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613805 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613823 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613836 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd68c\" (UniqueName: \"kubernetes.io/projected/d4b759eb-b3db-4d3e-87a3-3098c96f2273-kube-api-access-pd68c\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613862 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.613877 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b759eb-b3db-4d3e-87a3-3098c96f2273-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.899638 4666 generic.go:334] "Generic (PLEG): container finished" podID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerID="5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d" exitCode=0 Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.899689 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" event={"ID":"d4b759eb-b3db-4d3e-87a3-3098c96f2273","Type":"ContainerDied","Data":"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d"} Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.899719 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" event={"ID":"d4b759eb-b3db-4d3e-87a3-3098c96f2273","Type":"ContainerDied","Data":"53a035dbc099cf88b5ebf1a755804490fcedbc645390bddc7596c7930b34bbf6"} Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.899737 4666 scope.go:117] "RemoveContainer" containerID="5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.899926 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2h4fj" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.922239 4666 scope.go:117] "RemoveContainer" containerID="bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.942361 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.957867 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2h4fj"] Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.979165 4666 scope.go:117] "RemoveContainer" containerID="5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d" Dec 03 12:50:24 crc kubenswrapper[4666]: E1203 12:50:24.979767 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d\": container with ID starting with 5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d not found: ID does not exist" containerID="5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.979819 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d"} err="failed to get container status \"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d\": rpc error: code = NotFound desc = could not find container \"5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d\": container with ID starting with 5fe7dc1d43d7a261c060b50ef265b007c145f90d7e7b00ec7876c116a8cdc06d not found: ID does not exist" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.979849 4666 scope.go:117] "RemoveContainer" containerID="bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad" Dec 03 12:50:24 crc kubenswrapper[4666]: E1203 12:50:24.980288 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad\": container with ID starting with bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad not found: ID does not exist" containerID="bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad" Dec 03 12:50:24 crc kubenswrapper[4666]: I1203 12:50:24.980334 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad"} err="failed to get container status \"bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad\": rpc error: code = NotFound desc = could not find container \"bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad\": container with ID starting with bdf2ccb668d29a016f81fb491b6a40b6422aac8ee66038175d49c6763b6dbaad not found: ID does not exist" Dec 03 12:50:25 crc kubenswrapper[4666]: I1203 12:50:25.433539 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" path="/var/lib/kubelet/pods/d4b759eb-b3db-4d3e-87a3-3098c96f2273/volumes" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.119647 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4"] Dec 03 12:50:29 crc kubenswrapper[4666]: E1203 12:50:29.120587 4666 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.120610 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: E1203 12:50:29.120663 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="init" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.120675 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="init" Dec 03 12:50:29 crc kubenswrapper[4666]: E1203 12:50:29.120695 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.120708 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: E1203 12:50:29.120730 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="init" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.120741 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="init" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.121004 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ace9fc-00fe-4ec8-9f86-770e476d30e8" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.121042 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b759eb-b3db-4d3e-87a3-3098c96f2273" containerName="dnsmasq-dns" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.121891 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.124373 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.124969 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.128499 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.128916 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.142675 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4"] Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.203607 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.203758 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f52xg\" (UniqueName: \"kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.203848 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.203955 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.306015 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.306102 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f52xg\" (UniqueName: 
\"kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.306162 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.306245 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.311795 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.312220 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.313387 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.329570 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f52xg\" (UniqueName: \"kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.446230 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.948097 4666 generic.go:334] "Generic (PLEG): container finished" podID="20f2c961-32c5-4a6e-8d18-5296c889d4a3" containerID="6854770ab2dfa16a371954d5b885c5328bd5685757b261aae3b024de3c007ae5" exitCode=0 Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.948207 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20f2c961-32c5-4a6e-8d18-5296c889d4a3","Type":"ContainerDied","Data":"6854770ab2dfa16a371954d5b885c5328bd5685757b261aae3b024de3c007ae5"} Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.950863 4666 generic.go:334] "Generic (PLEG): container finished" podID="1a39a37c-b566-4726-8e2b-84be35262830" containerID="a1baaeaa7d76ef9c5d3239b1b13556f0760f02768082e631cb40ced7f66b5b91" exitCode=0 Dec 03 12:50:29 crc kubenswrapper[4666]: I1203 12:50:29.950882 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a39a37c-b566-4726-8e2b-84be35262830","Type":"ContainerDied","Data":"a1baaeaa7d76ef9c5d3239b1b13556f0760f02768082e631cb40ced7f66b5b91"} Dec 03 12:50:30 crc kubenswrapper[4666]: I1203 12:50:30.071173 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4"] Dec 03 12:50:30 crc kubenswrapper[4666]: I1203 12:50:30.964346 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a39a37c-b566-4726-8e2b-84be35262830","Type":"ContainerStarted","Data":"549320b3e3325ef149f977c5828af13e05d49ce416dbdb5bc1f12cf9dab915c5"} Dec 03 12:50:30 crc kubenswrapper[4666]: I1203 12:50:30.964836 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 12:50:30 crc kubenswrapper[4666]: I1203 12:50:30.973807 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" event={"ID":"c766de6f-79d7-4dc4-ae16-85a648bf8eb5","Type":"ContainerStarted","Data":"f1b8395608e04b6a0e2c33563e935cbb25ccac3710fdf1062635321d792f2ec4"} Dec 03 12:50:31 crc kubenswrapper[4666]: I1203 12:50:30.999872 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.999848534 podStartE2EDuration="36.999848534s" podCreationTimestamp="2025-12-03 12:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:30.987281812 +0000 UTC m=+2219.832242893" watchObservedRunningTime="2025-12-03 12:50:30.999848534 +0000 UTC m=+2219.844809585" Dec 03 12:50:31 crc kubenswrapper[4666]: I1203 12:50:31.002955 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20f2c961-32c5-4a6e-8d18-5296c889d4a3","Type":"ContainerStarted","Data":"8819fba21d86d124a11357965e773382915d2749679c693119b01d59dce1bdb4"} Dec 03 12:50:31 crc kubenswrapper[4666]: I1203 12:50:31.003389 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:50:31 crc kubenswrapper[4666]: I1203 12:50:31.056296 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.056277577 podStartE2EDuration="37.056277577s" podCreationTimestamp="2025-12-03 12:49:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:50:31.052144418 +0000 UTC m=+2219.897105489" watchObservedRunningTime="2025-12-03 12:50:31.056277577 +0000 UTC m=+2219.901238628" Dec 03 12:50:44 crc kubenswrapper[4666]: E1203 12:50:44.048165 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Dec 03 12:50:44 crc kubenswrapper[4666]: E1203 12:50:44.048940 4666 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 12:50:44 crc kubenswrapper[4666]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Dec 03 12:50:44 crc kubenswrapper[4666]: - hosts: all Dec 03 12:50:44 crc kubenswrapper[4666]: strategy: linear Dec 03 12:50:44 crc kubenswrapper[4666]: tasks: Dec 03 12:50:44 crc kubenswrapper[4666]: - name: Enable podified-repos Dec 03 12:50:44 crc kubenswrapper[4666]: become: true Dec 03 12:50:44 crc kubenswrapper[4666]: ansible.builtin.shell: | Dec 03 12:50:44 crc kubenswrapper[4666]: set -euxo pipefail Dec 03 12:50:44 crc kubenswrapper[4666]: pushd /var/tmp Dec 03 12:50:44 crc kubenswrapper[4666]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Dec 03 12:50:44 crc kubenswrapper[4666]: pushd repo-setup-main Dec 03 12:50:44 crc kubenswrapper[4666]: python3 -m venv ./venv Dec 03 12:50:44 crc kubenswrapper[4666]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Dec 03 12:50:44 crc kubenswrapper[4666]: ./venv/bin/repo-setup current-podified -b antelope Dec 03 12:50:44 crc kubenswrapper[4666]: popd Dec 03 12:50:44 crc kubenswrapper[4666]: rm -rf repo-setup-main Dec 03 12:50:44 crc kubenswrapper[4666]: Dec 03 12:50:44 crc kubenswrapper[4666]: Dec 03 12:50:44 crc kubenswrapper[4666]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Dec 03 12:50:44 crc kubenswrapper[4666]: edpm_override_hosts: openstack-edpm-ipam Dec 03 12:50:44 crc kubenswrapper[4666]: edpm_service_type: repo-setup Dec 03 12:50:44 crc kubenswrapper[4666]: Dec 03 12:50:44 crc kubenswrapper[4666]: Dec 03 12:50:44 crc kubenswrapper[4666]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f52xg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4_openstack(c766de6f-79d7-4dc4-ae16-85a648bf8eb5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Dec 03 12:50:44 crc kubenswrapper[4666]: > logger="UnhandledError" Dec 03 12:50:44 crc kubenswrapper[4666]: E1203 12:50:44.050522 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" Dec 03 12:50:44 crc kubenswrapper[4666]: E1203 12:50:44.173149 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" Dec 03 12:50:44 crc kubenswrapper[4666]: I1203 12:50:44.507159 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1a39a37c-b566-4726-8e2b-84be35262830" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.189:5671: connect: connection refused" Dec 03 12:50:45 crc kubenswrapper[4666]: I1203 12:50:45.129372 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 12:50:47 crc kubenswrapper[4666]: I1203 12:50:47.414606 4666 scope.go:117] "RemoveContainer" containerID="a56ba1b8f1b37faccc51fcb13b9b2cf2532539e3bcb53b2c84c3dd7976bb74dd" Dec 03 12:50:54 crc kubenswrapper[4666]: I1203 12:50:54.459519 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Dec 03 12:51:00 crc kubenswrapper[4666]: I1203 12:51:00.086773 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:51:00 crc kubenswrapper[4666]: I1203 12:51:00.347219 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" event={"ID":"c766de6f-79d7-4dc4-ae16-85a648bf8eb5","Type":"ContainerStarted","Data":"08fbb4a9e5509c6b63897c658c496ba120bbfa9281740ac9261c5e01cfdea84d"} Dec 03 12:51:00 crc kubenswrapper[4666]: I1203 12:51:00.374088 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" podStartSLOduration=1.370157847 podStartE2EDuration="31.374065635s" podCreationTimestamp="2025-12-03 12:50:29 +0000 UTC" firstStartedPulling="2025-12-03 12:50:30.080360941 +0000 UTC m=+2218.925322002" lastFinishedPulling="2025-12-03 12:51:00.084268739 +0000 UTC m=+2248.929229790" observedRunningTime="2025-12-03 12:51:00.365822157 +0000 UTC m=+2249.210783248" watchObservedRunningTime="2025-12-03 12:51:00.374065635 +0000 UTC m=+2249.219026696" Dec 03 12:51:13 crc kubenswrapper[4666]: I1203 12:51:13.475062 4666 generic.go:334] "Generic (PLEG): container finished" podID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" containerID="08fbb4a9e5509c6b63897c658c496ba120bbfa9281740ac9261c5e01cfdea84d" exitCode=0 Dec 03 12:51:13 crc kubenswrapper[4666]: I1203 12:51:13.475130 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" event={"ID":"c766de6f-79d7-4dc4-ae16-85a648bf8eb5","Type":"ContainerDied","Data":"08fbb4a9e5509c6b63897c658c496ba120bbfa9281740ac9261c5e01cfdea84d"} Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.819198 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.910680 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f52xg\" (UniqueName: \"kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg\") pod \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.910825 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key\") pod \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.910883 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle\") pod \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.910922 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory\") pod \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\" (UID: \"c766de6f-79d7-4dc4-ae16-85a648bf8eb5\") " Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.916497 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c766de6f-79d7-4dc4-ae16-85a648bf8eb5" (UID: "c766de6f-79d7-4dc4-ae16-85a648bf8eb5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.916724 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg" (OuterVolumeSpecName: "kube-api-access-f52xg") pod "c766de6f-79d7-4dc4-ae16-85a648bf8eb5" (UID: "c766de6f-79d7-4dc4-ae16-85a648bf8eb5"). InnerVolumeSpecName "kube-api-access-f52xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.937377 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c766de6f-79d7-4dc4-ae16-85a648bf8eb5" (UID: "c766de6f-79d7-4dc4-ae16-85a648bf8eb5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:14 crc kubenswrapper[4666]: I1203 12:51:14.937615 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory" (OuterVolumeSpecName: "inventory") pod "c766de6f-79d7-4dc4-ae16-85a648bf8eb5" (UID: "c766de6f-79d7-4dc4-ae16-85a648bf8eb5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.013267 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.013299 4666 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.013312 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.013322 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f52xg\" (UniqueName: \"kubernetes.io/projected/c766de6f-79d7-4dc4-ae16-85a648bf8eb5-kube-api-access-f52xg\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.495991 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" event={"ID":"c766de6f-79d7-4dc4-ae16-85a648bf8eb5","Type":"ContainerDied","Data":"f1b8395608e04b6a0e2c33563e935cbb25ccac3710fdf1062635321d792f2ec4"} Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.496386 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b8395608e04b6a0e2c33563e935cbb25ccac3710fdf1062635321d792f2ec4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.496133 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.585923 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4"] Dec 03 12:51:15 crc kubenswrapper[4666]: E1203 12:51:15.586312 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.586333 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.586512 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.587055 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.589195 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.591118 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.591251 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.592895 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.601858 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4"] Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.735643 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.735929 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.736055 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.736385 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.838496 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.838757 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.838811 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.838871 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.842525 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.842789 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.843610 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.856459 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:15 crc kubenswrapper[4666]: I1203 12:51:15.952867 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:51:16 crc kubenswrapper[4666]: I1203 12:51:16.469047 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4"] Dec 03 12:51:16 crc kubenswrapper[4666]: I1203 12:51:16.506145 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" event={"ID":"4fe8a0fb-f506-4363-9006-23cd005d0e78","Type":"ContainerStarted","Data":"b68f78b7c56e84ce7d6e35c9046551fd915510c084269fc6f795dae5eaa17415"} Dec 03 12:51:17 crc kubenswrapper[4666]: I1203 12:51:17.516853 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" event={"ID":"4fe8a0fb-f506-4363-9006-23cd005d0e78","Type":"ContainerStarted","Data":"f54d0a707f5883191ad902c738892e2be8a3fdee4bb26dd9be042abc0741dfde"} Dec 03 12:51:17 crc kubenswrapper[4666]: I1203 12:51:17.541253 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" podStartSLOduration=2.157898 podStartE2EDuration="2.541228391s" podCreationTimestamp="2025-12-03 12:51:15 +0000 UTC" firstStartedPulling="2025-12-03 12:51:16.472401787 +0000 UTC m=+2265.317362848" lastFinishedPulling="2025-12-03 12:51:16.855732188 +0000 UTC m=+2265.700693239" observedRunningTime="2025-12-03 12:51:17.531898255 +0000 UTC m=+2266.376859326" watchObservedRunningTime="2025-12-03 12:51:17.541228391 +0000 UTC m=+2266.386189442" Dec 03 12:51:47 crc kubenswrapper[4666]: I1203 12:51:47.549811 4666 scope.go:117] "RemoveContainer" containerID="f301d3cd2c534cc6662de316381561e2d6de6a79f6fbd6a660f684cda67c58e4" Dec 03 12:52:39 crc kubenswrapper[4666]: I1203 12:52:39.866744 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:52:39 crc kubenswrapper[4666]: I1203 12:52:39.867266 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:53:09 crc kubenswrapper[4666]: I1203 12:53:09.867919 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:53:09 crc kubenswrapper[4666]: I1203 12:53:09.868647 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:53:39 crc kubenswrapper[4666]: I1203 12:53:39.866889 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:53:39 crc kubenswrapper[4666]: I1203 12:53:39.867528 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:53:39 crc kubenswrapper[4666]: I1203 12:53:39.867596 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 12:53:39 crc kubenswrapper[4666]: I1203 12:53:39.868745 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:53:39 crc kubenswrapper[4666]: I1203 12:53:39.868850 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" gracePeriod=600 Dec 03 12:53:40 crc kubenswrapper[4666]: E1203 12:53:40.532067 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:53:40 crc kubenswrapper[4666]: I1203 12:53:40.877193 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" exitCode=0 Dec 03 12:53:40 crc kubenswrapper[4666]: I1203 12:53:40.877272 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"} Dec 03 12:53:40 crc kubenswrapper[4666]: I1203 12:53:40.877514 4666 scope.go:117] "RemoveContainer" containerID="baccbee2e06aeeb4d11e51e541c48984c3da2edee76ae78a953b9be250435078" Dec 03 12:53:40 crc kubenswrapper[4666]: I1203 12:53:40.878020 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:53:40 crc kubenswrapper[4666]: E1203 12:53:40.878283 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:53:47 crc kubenswrapper[4666]: I1203 12:53:47.642426 4666 scope.go:117] 
"RemoveContainer" containerID="aa1513804e384e3ed08fa1815f323e5f07ea71b4ae1db8b651019fe08f17718a" Dec 03 12:53:47 crc kubenswrapper[4666]: I1203 12:53:47.675776 4666 scope.go:117] "RemoveContainer" containerID="1724b4cce6e821b8de73dc7074f9b90ddd03803628e7914a2bd8d94ac6fe276a" Dec 03 12:53:47 crc kubenswrapper[4666]: I1203 12:53:47.693327 4666 scope.go:117] "RemoveContainer" containerID="446093b16b60891f0f489fc174a7115f00bdb51520a56fc9f7d6622581f67813" Dec 03 12:53:47 crc kubenswrapper[4666]: I1203 12:53:47.712322 4666 scope.go:117] "RemoveContainer" containerID="8d12ec2871170eafef694b63ac74ecf14896e1bd7a35e7e5fbd044ada6fb0320" Dec 03 12:53:52 crc kubenswrapper[4666]: I1203 12:53:52.424755 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:53:52 crc kubenswrapper[4666]: E1203 12:53:52.425614 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:03 crc kubenswrapper[4666]: I1203 12:54:03.423817 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:54:03 crc kubenswrapper[4666]: E1203 12:54:03.424614 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:16 crc kubenswrapper[4666]: I1203 12:54:16.423583 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:54:16 crc kubenswrapper[4666]: E1203 12:54:16.424522 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:29 crc kubenswrapper[4666]: I1203 12:54:29.424361 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:54:29 crc kubenswrapper[4666]: E1203 12:54:29.425158 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:40 crc kubenswrapper[4666]: I1203 12:54:40.424410 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:54:40 crc kubenswrapper[4666]: E1203 
12:54:40.427680 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.058821 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-r79jt"] Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.071393 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f6df-account-create-update-rnxbk"] Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.080466 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-r79jt"] Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.091313 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f6df-account-create-update-rnxbk"] Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.433478 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf41bd6e-97a2-4fe8-abab-ef09475a7c9b" path="/var/lib/kubelet/pods/cf41bd6e-97a2-4fe8-abab-ef09475a7c9b/volumes" Dec 03 12:54:41 crc kubenswrapper[4666]: I1203 12:54:41.434494 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb83348-41a7-482e-88e1-270211aacb35" path="/var/lib/kubelet/pods/efb83348-41a7-482e-88e1-270211aacb35/volumes" Dec 03 12:54:46 crc kubenswrapper[4666]: I1203 12:54:46.046578 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-f7mlv"] Dec 03 12:54:46 crc kubenswrapper[4666]: I1203 12:54:46.060767 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0ae4-account-create-update-hd4gw"] Dec 03 12:54:46 crc kubenswrapper[4666]: I1203 12:54:46.072260 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-f7mlv"] Dec 03 12:54:46 crc kubenswrapper[4666]: I1203 12:54:46.082918 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0ae4-account-create-update-hd4gw"] Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.027573 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zzrnb"] Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.035829 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d9a7-account-create-update-lqv8d"] Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.045344 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zzrnb"] Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.054470 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d9a7-account-create-update-lqv8d"] Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.433552 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42919b07-c6bf-47a5-8bc9-973574afd913" path="/var/lib/kubelet/pods/42919b07-c6bf-47a5-8bc9-973574afd913/volumes" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.434438 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec08bfb-edbf-4793-9d37-96f6e0e766d8" path="/var/lib/kubelet/pods/8ec08bfb-edbf-4793-9d37-96f6e0e766d8/volumes" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.435378 4666 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33a6a53-cf96-4f2e-bc74-5aad83b6298e" path="/var/lib/kubelet/pods/c33a6a53-cf96-4f2e-bc74-5aad83b6298e/volumes" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.436141 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7cb48a9-dd01-4708-89e7-cc5702003f99" path="/var/lib/kubelet/pods/d7cb48a9-dd01-4708-89e7-cc5702003f99/volumes" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.787437 4666 scope.go:117] "RemoveContainer" containerID="f8656f33884cc910982e58a86c1461e8b46a48f97ca902fed2f4a4ee012603f5" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.806796 4666 scope.go:117] "RemoveContainer" containerID="472bbcae1548c2ba82203d661e520d6432edac8df601cb98e5b01358ba34a357" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.826354 4666 scope.go:117] "RemoveContainer" containerID="e502eef6f2b687e09154cac1cee14a1bd5bf862b71d4ffab7e8fc9b882ed81bd" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.867595 4666 scope.go:117] "RemoveContainer" containerID="eb20dcfaca0ad0da20fecc906c946dba8c8305a697ef8852b24c83b64f312bb0" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.903400 4666 scope.go:117] "RemoveContainer" containerID="9487932252ea3eea33b5edcb63b0f403cfd6d3109575eb0752b4e7824ead0f04" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.919617 4666 scope.go:117] "RemoveContainer" containerID="4ce2428ee107d8d3fa7a280a073c2e51cf79a9381a63d235c93dd738a1da2f18" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.961749 4666 scope.go:117] "RemoveContainer" containerID="81f514f62bbee44890c51713efc64ca9e1440e88210594251a57709dc0fa07e9" Dec 03 12:54:47 crc kubenswrapper[4666]: I1203 12:54:47.994332 4666 scope.go:117] "RemoveContainer" containerID="130a10b959f51f0912a9673228e9dc34a296ed444a01a11fa8a2040679804694" Dec 03 12:54:48 crc kubenswrapper[4666]: I1203 12:54:48.015894 4666 scope.go:117] "RemoveContainer" containerID="0254f3f6217317ca9cd0c26a7ea9ebb701d8560dacccbd22066574bc43d4942b" Dec 03 12:54:53 crc kubenswrapper[4666]: I1203 12:54:53.509457 4666 generic.go:334] "Generic (PLEG): container finished" podID="4fe8a0fb-f506-4363-9006-23cd005d0e78" containerID="f54d0a707f5883191ad902c738892e2be8a3fdee4bb26dd9be042abc0741dfde" exitCode=0 Dec 03 12:54:53 crc kubenswrapper[4666]: I1203 12:54:53.509544 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" event={"ID":"4fe8a0fb-f506-4363-9006-23cd005d0e78","Type":"ContainerDied","Data":"f54d0a707f5883191ad902c738892e2be8a3fdee4bb26dd9be042abc0741dfde"} Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.910764 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.927297 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz\") pod \"4fe8a0fb-f506-4363-9006-23cd005d0e78\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.927772 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle\") pod \"4fe8a0fb-f506-4363-9006-23cd005d0e78\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.927843 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key\") pod \"4fe8a0fb-f506-4363-9006-23cd005d0e78\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.928129 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory\") pod \"4fe8a0fb-f506-4363-9006-23cd005d0e78\" (UID: \"4fe8a0fb-f506-4363-9006-23cd005d0e78\") " Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.933037 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4fe8a0fb-f506-4363-9006-23cd005d0e78" (UID: "4fe8a0fb-f506-4363-9006-23cd005d0e78"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.935427 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz" (OuterVolumeSpecName: "kube-api-access-qdbrz") pod "4fe8a0fb-f506-4363-9006-23cd005d0e78" (UID: "4fe8a0fb-f506-4363-9006-23cd005d0e78"). InnerVolumeSpecName "kube-api-access-qdbrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.953201 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory" (OuterVolumeSpecName: "inventory") pod "4fe8a0fb-f506-4363-9006-23cd005d0e78" (UID: "4fe8a0fb-f506-4363-9006-23cd005d0e78"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:54:54 crc kubenswrapper[4666]: I1203 12:54:54.953572 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4fe8a0fb-f506-4363-9006-23cd005d0e78" (UID: "4fe8a0fb-f506-4363-9006-23cd005d0e78"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.030856 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.030899 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/4fe8a0fb-f506-4363-9006-23cd005d0e78-kube-api-access-qdbrz\") on node \"crc\" DevicePath \"\"" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.030916 4666 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.030927 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fe8a0fb-f506-4363-9006-23cd005d0e78-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.423752 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:54:55 crc kubenswrapper[4666]: E1203 12:54:55.424161 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.526968 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" event={"ID":"4fe8a0fb-f506-4363-9006-23cd005d0e78","Type":"ContainerDied","Data":"b68f78b7c56e84ce7d6e35c9046551fd915510c084269fc6f795dae5eaa17415"} Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.527208 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b68f78b7c56e84ce7d6e35c9046551fd915510c084269fc6f795dae5eaa17415" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.527164 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.596937 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"] Dec 03 12:54:55 crc kubenswrapper[4666]: E1203 12:54:55.597326 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe8a0fb-f506-4363-9006-23cd005d0e78" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.597342 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe8a0fb-f506-4363-9006-23cd005d0e78" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.597528 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe8a0fb-f506-4363-9006-23cd005d0e78" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.598104 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.600253 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.600321 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.601151 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.601698 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.609212 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"] Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.643920 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g259m\" (UniqueName: \"kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.644065 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.644164 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 
12:54:55.745362 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g259m\" (UniqueName: \"kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.745460 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.745526 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.749282 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.749284 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.763047 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g259m\" (UniqueName: \"kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mt54m\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:55 crc kubenswrapper[4666]: I1203 12:54:55.922953 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:54:56 crc kubenswrapper[4666]: I1203 12:54:56.439810 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"] Dec 03 12:54:56 crc kubenswrapper[4666]: I1203 12:54:56.443121 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:54:56 crc kubenswrapper[4666]: I1203 12:54:56.534198 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" event={"ID":"86cef496-ab56-4162-b028-9d05332eb53c","Type":"ContainerStarted","Data":"732d5ebf23490b95dbe657289979714d672511e3be6fc6772d37eaf08596894b"} Dec 03 12:54:58 crc kubenswrapper[4666]: I1203 12:54:58.551628 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" event={"ID":"86cef496-ab56-4162-b028-9d05332eb53c","Type":"ContainerStarted","Data":"d35d04e69f59b062d1eb964215373f2832a7c45275f47e6642d6aa0a30579199"} Dec 03 12:54:58 crc kubenswrapper[4666]: I1203 12:54:58.576108 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" podStartSLOduration=2.355759751 podStartE2EDuration="3.576077416s" podCreationTimestamp="2025-12-03 12:54:55 +0000 UTC" firstStartedPulling="2025-12-03 12:54:56.442877438 +0000 UTC m=+2485.287838489" lastFinishedPulling="2025-12-03 12:54:57.663195103 +0000 UTC m=+2486.508156154" observedRunningTime="2025-12-03 12:54:58.569147839 +0000 UTC m=+2487.414108890" watchObservedRunningTime="2025-12-03 12:54:58.576077416 +0000 UTC m=+2487.421038467" Dec 03 12:55:10 crc kubenswrapper[4666]: I1203 12:55:10.423965 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:55:10 crc kubenswrapper[4666]: E1203 12:55:10.425339 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.050571 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-88bf-account-create-update-t9qsf"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.061316 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-88bf-account-create-update-t9qsf"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.072673 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1a6f-account-create-update-gr78s"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.083012 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vzd52"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.092979 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-22jn4"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.100506 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vzd52"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.107449 4666 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1a6f-account-create-update-gr78s"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.115261 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-22jn4"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.122429 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0d69-account-create-update-bhx5p"] Dec 03 12:55:14 crc kubenswrapper[4666]: I1203 12:55:14.129338 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0d69-account-create-update-bhx5p"] Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.028489 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jjp74"] Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.038225 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7ljg8"] Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.046621 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7ljg8"] Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.053437 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jjp74"] Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.443315 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0e6849-2521-4df6-b2f2-667769034675" path="/var/lib/kubelet/pods/0a0e6849-2521-4df6-b2f2-667769034675/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.444849 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9f6c9f-cd12-43d6-b79b-240db68c0e88" path="/var/lib/kubelet/pods/6b9f6c9f-cd12-43d6-b79b-240db68c0e88/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.446239 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823d7744-f03e-4ed5-b16b-823cf42f9084" path="/var/lib/kubelet/pods/823d7744-f03e-4ed5-b16b-823cf42f9084/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.447554 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979de73a-2c31-41fa-aeb4-fab22feeacc0" path="/var/lib/kubelet/pods/979de73a-2c31-41fa-aeb4-fab22feeacc0/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.450735 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae6d2ff-7625-402d-8b3d-ac215f993f2c" path="/var/lib/kubelet/pods/aae6d2ff-7625-402d-8b3d-ac215f993f2c/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.452788 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22dc9c8-5134-46ec-ba0f-c2b334490035" path="/var/lib/kubelet/pods/d22dc9c8-5134-46ec-ba0f-c2b334490035/volumes" Dec 03 12:55:15 crc kubenswrapper[4666]: I1203 12:55:15.455320 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d678da1e-e438-4e63-99e4-8eb737a077f5" path="/var/lib/kubelet/pods/d678da1e-e438-4e63-99e4-8eb737a077f5/volumes" Dec 03 12:55:19 crc kubenswrapper[4666]: I1203 12:55:19.039795 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-72mb9"] Dec 03 12:55:19 crc kubenswrapper[4666]: I1203 12:55:19.046638 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-72mb9"] Dec 03 12:55:19 crc kubenswrapper[4666]: I1203 12:55:19.435176 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1648693-9c33-4c6b-94d9-fa47ea5b38d4" 
path="/var/lib/kubelet/pods/b1648693-9c33-4c6b-94d9-fa47ea5b38d4/volumes" Dec 03 12:55:24 crc kubenswrapper[4666]: I1203 12:55:24.432742 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:55:24 crc kubenswrapper[4666]: E1203 12:55:24.438047 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:55:37 crc kubenswrapper[4666]: I1203 12:55:37.423219 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:55:37 crc kubenswrapper[4666]: E1203 12:55:37.424367 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.176637 4666 scope.go:117] "RemoveContainer" containerID="0208e379195b8d165c9277f636ade29dd399cd283bfd12c5369ec8b6e422b130" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.200034 4666 scope.go:117] "RemoveContainer" containerID="f9f7c4ce5bd86704f5384d052989d2abc2521f44ae6afa2cc63324931b06015c" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.239723 4666 scope.go:117] "RemoveContainer" containerID="a1515810d3ef40520c3156b32c4ca9d6541943ef1ffbe086c4e25eafcb552d18" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.276077 4666 scope.go:117] "RemoveContainer" containerID="7206ef27d76935440c78e0761162410caad11f5f1cec71ff05893027332a49ec" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.316631 4666 scope.go:117] "RemoveContainer" containerID="5978737f84e7edf423b566c03c708dd64dd1d9993effc3fbc93dd0b9cb2343b0" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.364314 4666 scope.go:117] "RemoveContainer" containerID="9c0452d5372f8f01316d5ffb31e90b9081b1e9097324e4540db18c9249024106" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.400001 4666 scope.go:117] "RemoveContainer" containerID="0b47f7cee016ee1af5123f69f109c02adc62dccb78c897fbd568bcf9de53c150" Dec 03 12:55:48 crc kubenswrapper[4666]: I1203 12:55:48.433279 4666 scope.go:117] "RemoveContainer" containerID="cb34e89046aafa174a2aa065c35eafc36135635fa742ea558e1e42c0090e754d" Dec 03 12:55:52 crc kubenswrapper[4666]: I1203 12:55:52.423718 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:55:52 crc kubenswrapper[4666]: E1203 12:55:52.424640 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:55:53 crc 
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.051427 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v65m9"]
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.062978 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2lngb"]
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.075045 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v65m9"]
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.082421 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2lngb"]
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.444724 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f8d025-21f1-4e23-9e6a-75cf2202e447" path="/var/lib/kubelet/pods/17f8d025-21f1-4e23-9e6a-75cf2202e447/volumes"
Dec 03 12:55:53 crc kubenswrapper[4666]: I1203 12:55:53.446475 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd8e823-c02f-4841-a07a-a1e7fa64f75e" path="/var/lib/kubelet/pods/4bd8e823-c02f-4841-a07a-a1e7fa64f75e/volumes"
Dec 03 12:55:57 crc kubenswrapper[4666]: I1203 12:55:57.025888 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cnrgx"]
Dec 03 12:55:57 crc kubenswrapper[4666]: I1203 12:55:57.035766 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cnrgx"]
Dec 03 12:55:57 crc kubenswrapper[4666]: I1203 12:55:57.434203 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc7cb74-0c27-4786-93fc-31c0e3a565b7" path="/var/lib/kubelet/pods/3fc7cb74-0c27-4786-93fc-31c0e3a565b7/volumes"
Dec 03 12:56:07 crc kubenswrapper[4666]: I1203 12:56:07.431377 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"
Dec 03 12:56:07 crc kubenswrapper[4666]: E1203 12:56:07.432593 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:56:08 crc kubenswrapper[4666]: I1203 12:56:08.048322 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-g4zjc"]
Dec 03 12:56:08 crc kubenswrapper[4666]: I1203 12:56:08.057209 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-g4zjc"]
Dec 03 12:56:09 crc kubenswrapper[4666]: I1203 12:56:09.443771 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac4b252-1187-43c2-bfdd-0d48db4ff9e8" path="/var/lib/kubelet/pods/7ac4b252-1187-43c2-bfdd-0d48db4ff9e8/volumes"
Dec 03 12:56:16 crc kubenswrapper[4666]: I1203 12:56:16.242791 4666 generic.go:334] "Generic (PLEG): container finished" podID="86cef496-ab56-4162-b028-9d05332eb53c" containerID="d35d04e69f59b062d1eb964215373f2832a7c45275f47e6642d6aa0a30579199" exitCode=0
Dec 03 12:56:16 crc kubenswrapper[4666]: I1203 12:56:16.242888 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" event={"ID":"86cef496-ab56-4162-b028-9d05332eb53c","Type":"ContainerDied","Data":"d35d04e69f59b062d1eb964215373f2832a7c45275f47e6642d6aa0a30579199"}
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.615988 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.745505 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-inventory\") pod \"86cef496-ab56-4162-b028-9d05332eb53c\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") "
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.745594 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g259m\" (UniqueName: \"kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m\") pod \"86cef496-ab56-4162-b028-9d05332eb53c\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") "
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.745714 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key\") pod \"86cef496-ab56-4162-b028-9d05332eb53c\" (UID: \"86cef496-ab56-4162-b028-9d05332eb53c\") "
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.750551 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m" (OuterVolumeSpecName: "kube-api-access-g259m") pod "86cef496-ab56-4162-b028-9d05332eb53c" (UID: "86cef496-ab56-4162-b028-9d05332eb53c"). InnerVolumeSpecName "kube-api-access-g259m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.775908 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86cef496-ab56-4162-b028-9d05332eb53c" (UID: "86cef496-ab56-4162-b028-9d05332eb53c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.847281 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.847312 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86cef496-ab56-4162-b028-9d05332eb53c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:17 crc kubenswrapper[4666]: I1203 12:56:17.847322 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g259m\" (UniqueName: \"kubernetes.io/projected/86cef496-ab56-4162-b028-9d05332eb53c-kube-api-access-g259m\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.259811 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" event={"ID":"86cef496-ab56-4162-b028-9d05332eb53c","Type":"ContainerDied","Data":"732d5ebf23490b95dbe657289979714d672511e3be6fc6772d37eaf08596894b"} Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.259852 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732d5ebf23490b95dbe657289979714d672511e3be6fc6772d37eaf08596894b" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.259949 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.340625 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv"] Dec 03 12:56:18 crc kubenswrapper[4666]: E1203 12:56:18.341312 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86cef496-ab56-4162-b028-9d05332eb53c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.341334 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cef496-ab56-4162-b028-9d05332eb53c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.341600 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="86cef496-ab56-4162-b028-9d05332eb53c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.342325 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.344792 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.345194 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.345384 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.345582 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.356434 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv"] Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.460834 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dkw\" (UniqueName: \"kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.460931 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.461105 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.563232 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44dkw\" (UniqueName: \"kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.563421 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.563511 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.568981 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.569038 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.590555 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dkw\" (UniqueName: \"kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bjshv\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:18 crc kubenswrapper[4666]: I1203 12:56:18.670157 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:19 crc kubenswrapper[4666]: I1203 12:56:19.258543 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv"] Dec 03 12:56:20 crc kubenswrapper[4666]: I1203 12:56:20.282493 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" event={"ID":"f18b806a-b9ab-4edd-83df-d8298d026d6e","Type":"ContainerStarted","Data":"4b43bfd83e0f2efba46abee6c4cb646a1157a7edc5c6c6e64f0a5f55fe18cbdc"} Dec 03 12:56:20 crc kubenswrapper[4666]: I1203 12:56:20.282806 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" event={"ID":"f18b806a-b9ab-4edd-83df-d8298d026d6e","Type":"ContainerStarted","Data":"94e260301aa5e6c2a1fc95a2f9894b165654bb201c833b703dae699ec0cfe921"} Dec 03 12:56:20 crc kubenswrapper[4666]: I1203 12:56:20.307591 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" podStartSLOduration=1.854572583 podStartE2EDuration="2.307570342s" podCreationTimestamp="2025-12-03 12:56:18 +0000 UTC" firstStartedPulling="2025-12-03 12:56:19.268197276 +0000 UTC m=+2568.113158327" lastFinishedPulling="2025-12-03 12:56:19.721195045 +0000 UTC m=+2568.566156086" observedRunningTime="2025-12-03 12:56:20.297490791 +0000 UTC m=+2569.142451852" watchObservedRunningTime="2025-12-03 12:56:20.307570342 +0000 UTC m=+2569.152531403" Dec 03 12:56:22 crc kubenswrapper[4666]: I1203 12:56:22.423853 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:56:22 crc kubenswrapper[4666]: E1203 12:56:22.424446 4666 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:56:24 crc kubenswrapper[4666]: I1203 12:56:24.030722 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vrrxv"] Dec 03 12:56:24 crc kubenswrapper[4666]: I1203 12:56:24.038289 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vrrxv"] Dec 03 12:56:25 crc kubenswrapper[4666]: I1203 12:56:25.326575 4666 generic.go:334] "Generic (PLEG): container finished" podID="f18b806a-b9ab-4edd-83df-d8298d026d6e" containerID="4b43bfd83e0f2efba46abee6c4cb646a1157a7edc5c6c6e64f0a5f55fe18cbdc" exitCode=0 Dec 03 12:56:25 crc kubenswrapper[4666]: I1203 12:56:25.326634 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" event={"ID":"f18b806a-b9ab-4edd-83df-d8298d026d6e","Type":"ContainerDied","Data":"4b43bfd83e0f2efba46abee6c4cb646a1157a7edc5c6c6e64f0a5f55fe18cbdc"} Dec 03 12:56:25 crc kubenswrapper[4666]: I1203 12:56:25.440870 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aac4d80-7d0d-4037-a398-6a28ab35d1c9" path="/var/lib/kubelet/pods/9aac4d80-7d0d-4037-a398-6a28ab35d1c9/volumes" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.725603 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.819301 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44dkw\" (UniqueName: \"kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw\") pod \"f18b806a-b9ab-4edd-83df-d8298d026d6e\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.821441 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key\") pod \"f18b806a-b9ab-4edd-83df-d8298d026d6e\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.821552 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory\") pod \"f18b806a-b9ab-4edd-83df-d8298d026d6e\" (UID: \"f18b806a-b9ab-4edd-83df-d8298d026d6e\") " Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.828897 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw" (OuterVolumeSpecName: "kube-api-access-44dkw") pod "f18b806a-b9ab-4edd-83df-d8298d026d6e" (UID: "f18b806a-b9ab-4edd-83df-d8298d026d6e"). InnerVolumeSpecName "kube-api-access-44dkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.850978 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory" (OuterVolumeSpecName: "inventory") pod "f18b806a-b9ab-4edd-83df-d8298d026d6e" (UID: "f18b806a-b9ab-4edd-83df-d8298d026d6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.859418 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f18b806a-b9ab-4edd-83df-d8298d026d6e" (UID: "f18b806a-b9ab-4edd-83df-d8298d026d6e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.924431 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.924500 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44dkw\" (UniqueName: \"kubernetes.io/projected/f18b806a-b9ab-4edd-83df-d8298d026d6e-kube-api-access-44dkw\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:26 crc kubenswrapper[4666]: I1203 12:56:26.924513 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f18b806a-b9ab-4edd-83df-d8298d026d6e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.348128 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" event={"ID":"f18b806a-b9ab-4edd-83df-d8298d026d6e","Type":"ContainerDied","Data":"94e260301aa5e6c2a1fc95a2f9894b165654bb201c833b703dae699ec0cfe921"} Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.348178 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e260301aa5e6c2a1fc95a2f9894b165654bb201c833b703dae699ec0cfe921" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.348239 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.441590 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s"] Dec 03 12:56:27 crc kubenswrapper[4666]: E1203 12:56:27.441997 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18b806a-b9ab-4edd-83df-d8298d026d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.442017 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18b806a-b9ab-4edd-83df-d8298d026d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.442357 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18b806a-b9ab-4edd-83df-d8298d026d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.443118 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.448590 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.448873 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.449076 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.449237 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.468883 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s"] Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.535509 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.535637 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.535685 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knd7x\" (UniqueName: \"kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.637543 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.637645 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.637671 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knd7x\" (UniqueName: \"kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: 
\"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.641756 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.645602 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.659165 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knd7x\" (UniqueName: \"kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xsx2s\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:27 crc kubenswrapper[4666]: I1203 12:56:27.769060 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:56:28 crc kubenswrapper[4666]: I1203 12:56:28.310964 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s"] Dec 03 12:56:28 crc kubenswrapper[4666]: I1203 12:56:28.359520 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" event={"ID":"4621207e-db3e-4ef5-b234-4d9e7443ff87","Type":"ContainerStarted","Data":"35ad23ccb20bf724c6e93db7c30f5650db226c46672e289fdb82aedda4cb4aa6"} Dec 03 12:56:30 crc kubenswrapper[4666]: I1203 12:56:30.377749 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" event={"ID":"4621207e-db3e-4ef5-b234-4d9e7443ff87","Type":"ContainerStarted","Data":"f4fefe55b0034ba41212fb2a433c5a8321977b5d52b2f78d95dc47d4278cdf6c"} Dec 03 12:56:30 crc kubenswrapper[4666]: I1203 12:56:30.396937 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" podStartSLOduration=2.268699194 podStartE2EDuration="3.39691982s" podCreationTimestamp="2025-12-03 12:56:27 +0000 UTC" firstStartedPulling="2025-12-03 12:56:28.31934544 +0000 UTC m=+2577.164306511" lastFinishedPulling="2025-12-03 12:56:29.447566076 +0000 UTC m=+2578.292527137" observedRunningTime="2025-12-03 12:56:30.396408556 +0000 UTC m=+2579.241369607" watchObservedRunningTime="2025-12-03 12:56:30.39691982 +0000 UTC m=+2579.241880871" Dec 03 12:56:34 crc kubenswrapper[4666]: I1203 12:56:34.423365 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:56:34 crc kubenswrapper[4666]: E1203 12:56:34.423951 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Dec 03 12:56:34 crc kubenswrapper[4666]: E1203 12:56:34.423951 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:56:45 crc kubenswrapper[4666]: I1203 12:56:45.424545 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"
Dec 03 12:56:45 crc kubenswrapper[4666]: E1203 12:56:45.425188 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:56:48 crc kubenswrapper[4666]: I1203 12:56:48.557064 4666 scope.go:117] "RemoveContainer" containerID="2e75119f3f6d7de144f91ea4e85a9b5a8eacbacefc96ae2614ca9ca884f68951"
Dec 03 12:56:48 crc kubenswrapper[4666]: I1203 12:56:48.589200 4666 scope.go:117] "RemoveContainer" containerID="cfe8952f4198519cf95562a9c321f3f7f6e2771b7f5feea2e11a1b7409bfd91b"
Dec 03 12:56:48 crc kubenswrapper[4666]: I1203 12:56:48.697506 4666 scope.go:117] "RemoveContainer" containerID="042a9505d607d6be8f5cae0985be852eb4b16770e8e2dce82be6d90c7dcc62ef"
Dec 03 12:56:48 crc kubenswrapper[4666]: I1203 12:56:48.751646 4666 scope.go:117] "RemoveContainer" containerID="3773289bcf1b2a4bdd8e616a70153c6c877a5cef9abe269603bed77f84330e0d"
Dec 03 12:56:48 crc kubenswrapper[4666]: I1203 12:56:48.832081 4666 scope.go:117] "RemoveContainer" containerID="640915585d72b35a44c9ff49461b6510dae0d6ec5dd2cbc7b9cdfe641a03b825"
Dec 03 12:56:56 crc kubenswrapper[4666]: I1203 12:56:56.423761 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878"
Dec 03 12:56:56 crc kubenswrapper[4666]: E1203 12:56:56.424856 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.210063 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"]
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.212436 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.239139 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"]
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.328438 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.328503 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.329179 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k767h\" (UniqueName: \"kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.431187 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.431638 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.431775 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k767h\" (UniqueName: \"kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.432197 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.432468 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.465608 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k767h\" (UniqueName: \"kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h\") pod \"redhat-marketplace-4pr6q\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:58 crc kubenswrapper[4666]: I1203 12:56:58.561334 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pr6q"
Dec 03 12:56:59 crc kubenswrapper[4666]: I1203 12:56:59.000427 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"]
Dec 03 12:56:59 crc kubenswrapper[4666]: I1203 12:56:59.685803 4666 generic.go:334] "Generic (PLEG): container finished" podID="1b09b859-b95b-448d-9060-806e772410c1" containerID="0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68" exitCode=0
Dec 03 12:56:59 crc kubenswrapper[4666]: I1203 12:56:59.685867 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerDied","Data":"0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68"}
Dec 03 12:56:59 crc kubenswrapper[4666]: I1203 12:56:59.686122 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerStarted","Data":"75e826bfbbed639b97a3bd7fd4a887c73cd4dcea91c9b7a5b40204a04c0809e9"}
Dec 03 12:57:04 crc kubenswrapper[4666]: I1203 12:57:04.749042 4666 generic.go:334] "Generic (PLEG): container finished" podID="1b09b859-b95b-448d-9060-806e772410c1" containerID="aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022" exitCode=0
Dec 03 12:57:04 crc kubenswrapper[4666]: I1203 12:57:04.749170 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerDied","Data":"aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022"}
Dec 03 12:57:05 crc kubenswrapper[4666]: I1203 12:57:05.761980 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerStarted","Data":"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502"}
Dec 03 12:57:05 crc kubenswrapper[4666]: I1203 12:57:05.814606 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4pr6q" podStartSLOduration=2.220408542 podStartE2EDuration="7.814579009s" podCreationTimestamp="2025-12-03 12:56:58 +0000 UTC" firstStartedPulling="2025-12-03 12:56:59.688151961 +0000 UTC m=+2608.533113012" lastFinishedPulling="2025-12-03 12:57:05.282322418 +0000 UTC m=+2614.127283479" observedRunningTime="2025-12-03 12:57:05.779209308 +0000 UTC m=+2614.624170379" watchObservedRunningTime="2025-12-03 12:57:05.814579009 +0000 UTC m=+2614.659540080"
Dec 03 12:57:06 crc kubenswrapper[4666]: I1203 12:57:06.774314 4666 generic.go:334] "Generic (PLEG): container finished" podID="4621207e-db3e-4ef5-b234-4d9e7443ff87" containerID="f4fefe55b0034ba41212fb2a433c5a8321977b5d52b2f78d95dc47d4278cdf6c" exitCode=0
event={"ID":"4621207e-db3e-4ef5-b234-4d9e7443ff87","Type":"ContainerDied","Data":"f4fefe55b0034ba41212fb2a433c5a8321977b5d52b2f78d95dc47d4278cdf6c"} Dec 03 12:57:07 crc kubenswrapper[4666]: I1203 12:57:07.040978 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7596k"] Dec 03 12:57:07 crc kubenswrapper[4666]: I1203 12:57:07.049852 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7596k"] Dec 03 12:57:07 crc kubenswrapper[4666]: I1203 12:57:07.424252 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:57:07 crc kubenswrapper[4666]: E1203 12:57:07.424555 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:57:07 crc kubenswrapper[4666]: I1203 12:57:07.435927 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f374154c-83df-4685-8224-aa067097648d" path="/var/lib/kubelet/pods/f374154c-83df-4685-8224-aa067097648d/volumes" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.042209 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4f48w"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.050010 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b618-account-create-update-2c7dc"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.077070 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3a3c-account-create-update-4qd86"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.088651 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f103-account-create-update-hscdk"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.096079 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4f48w"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.103970 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3a3c-account-create-update-4qd86"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.110860 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b618-account-create-update-2c7dc"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.119522 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f103-account-create-update-hscdk"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.126431 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-26qkc"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.133237 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-26qkc"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.233325 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.309117 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knd7x\" (UniqueName: \"kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x\") pod \"4621207e-db3e-4ef5-b234-4d9e7443ff87\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.309196 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key\") pod \"4621207e-db3e-4ef5-b234-4d9e7443ff87\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.309249 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory\") pod \"4621207e-db3e-4ef5-b234-4d9e7443ff87\" (UID: \"4621207e-db3e-4ef5-b234-4d9e7443ff87\") " Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.315587 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x" (OuterVolumeSpecName: "kube-api-access-knd7x") pod "4621207e-db3e-4ef5-b234-4d9e7443ff87" (UID: "4621207e-db3e-4ef5-b234-4d9e7443ff87"). InnerVolumeSpecName "kube-api-access-knd7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.333878 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4621207e-db3e-4ef5-b234-4d9e7443ff87" (UID: "4621207e-db3e-4ef5-b234-4d9e7443ff87"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.344010 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory" (OuterVolumeSpecName: "inventory") pod "4621207e-db3e-4ef5-b234-4d9e7443ff87" (UID: "4621207e-db3e-4ef5-b234-4d9e7443ff87"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.411586 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knd7x\" (UniqueName: \"kubernetes.io/projected/4621207e-db3e-4ef5-b234-4d9e7443ff87-kube-api-access-knd7x\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.411637 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.411655 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4621207e-db3e-4ef5-b234-4d9e7443ff87-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.562450 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.562687 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.660173 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.833575 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.840337 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s" event={"ID":"4621207e-db3e-4ef5-b234-4d9e7443ff87","Type":"ContainerDied","Data":"35ad23ccb20bf724c6e93db7c30f5650db226c46672e289fdb82aedda4cb4aa6"} Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.840385 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35ad23ccb20bf724c6e93db7c30f5650db226c46672e289fdb82aedda4cb4aa6" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.886586 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx"] Dec 03 12:57:08 crc kubenswrapper[4666]: E1203 12:57:08.886929 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4621207e-db3e-4ef5-b234-4d9e7443ff87" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.886952 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4621207e-db3e-4ef5-b234-4d9e7443ff87" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.887172 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4621207e-db3e-4ef5-b234-4d9e7443ff87" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.887731 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.889528 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.889957 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.889955 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.890623 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.901635 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx"] Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.921942 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twvr2\" (UniqueName: \"kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.922026 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:08 crc kubenswrapper[4666]: I1203 12:57:08.922059 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.024394 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twvr2\" (UniqueName: \"kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.024692 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.024728 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" 
(UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.028837 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.028848 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.041725 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twvr2\" (UniqueName: \"kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.216985 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.455205 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502dbfff-6c76-448e-aff9-db535351f22f" path="/var/lib/kubelet/pods/502dbfff-6c76-448e-aff9-db535351f22f/volumes" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.455889 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54662f34-5ced-49d9-bfba-ddccae72099e" path="/var/lib/kubelet/pods/54662f34-5ced-49d9-bfba-ddccae72099e/volumes" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.456528 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98004b4-42cb-4eca-9da3-c440aa955f18" path="/var/lib/kubelet/pods/b98004b4-42cb-4eca-9da3-c440aa955f18/volumes" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.457190 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baaeae1d-3d78-477a-a46d-80ee1c6447b1" path="/var/lib/kubelet/pods/baaeae1d-3d78-477a-a46d-80ee1c6447b1/volumes" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.458342 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d206ccec-3195-42c1-8c07-b27785183fd7" path="/var/lib/kubelet/pods/d206ccec-3195-42c1-8c07-b27785183fd7/volumes" Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.711697 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx"] Dec 03 12:57:09 crc kubenswrapper[4666]: W1203 12:57:09.713974 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49e41280_1166_4853_b3f9_83436578577b.slice/crio-48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949 WatchSource:0}: Error finding container 48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949: Status 404 returned error can't find the container with id 
48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949 Dec 03 12:57:09 crc kubenswrapper[4666]: I1203 12:57:09.841176 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" event={"ID":"49e41280-1166-4853-b3f9-83436578577b","Type":"ContainerStarted","Data":"48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949"} Dec 03 12:57:11 crc kubenswrapper[4666]: I1203 12:57:11.860375 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" event={"ID":"49e41280-1166-4853-b3f9-83436578577b","Type":"ContainerStarted","Data":"ddf643fc384b3475726bbebee3e674ff3310fd46caf67465d9e2b5e76caf5f8b"} Dec 03 12:57:11 crc kubenswrapper[4666]: I1203 12:57:11.892779 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" podStartSLOduration=2.989483828 podStartE2EDuration="3.892754562s" podCreationTimestamp="2025-12-03 12:57:08 +0000 UTC" firstStartedPulling="2025-12-03 12:57:09.716340722 +0000 UTC m=+2618.561301773" lastFinishedPulling="2025-12-03 12:57:10.619611426 +0000 UTC m=+2619.464572507" observedRunningTime="2025-12-03 12:57:11.878585441 +0000 UTC m=+2620.723546532" watchObservedRunningTime="2025-12-03 12:57:11.892754562 +0000 UTC m=+2620.737715663" Dec 03 12:57:14 crc kubenswrapper[4666]: I1203 12:57:14.891302 4666 generic.go:334] "Generic (PLEG): container finished" podID="49e41280-1166-4853-b3f9-83436578577b" containerID="ddf643fc384b3475726bbebee3e674ff3310fd46caf67465d9e2b5e76caf5f8b" exitCode=0 Dec 03 12:57:14 crc kubenswrapper[4666]: I1203 12:57:14.891403 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" event={"ID":"49e41280-1166-4853-b3f9-83436578577b","Type":"ContainerDied","Data":"ddf643fc384b3475726bbebee3e674ff3310fd46caf67465d9e2b5e76caf5f8b"} Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.318289 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.388818 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory\") pod \"49e41280-1166-4853-b3f9-83436578577b\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.389165 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key\") pod \"49e41280-1166-4853-b3f9-83436578577b\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.389213 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvr2\" (UniqueName: \"kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2\") pod \"49e41280-1166-4853-b3f9-83436578577b\" (UID: \"49e41280-1166-4853-b3f9-83436578577b\") " Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.395163 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2" (OuterVolumeSpecName: "kube-api-access-twvr2") pod "49e41280-1166-4853-b3f9-83436578577b" (UID: "49e41280-1166-4853-b3f9-83436578577b"). InnerVolumeSpecName "kube-api-access-twvr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.417955 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory" (OuterVolumeSpecName: "inventory") pod "49e41280-1166-4853-b3f9-83436578577b" (UID: "49e41280-1166-4853-b3f9-83436578577b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.439161 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49e41280-1166-4853-b3f9-83436578577b" (UID: "49e41280-1166-4853-b3f9-83436578577b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.491699 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.491742 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49e41280-1166-4853-b3f9-83436578577b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.491764 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twvr2\" (UniqueName: \"kubernetes.io/projected/49e41280-1166-4853-b3f9-83436578577b-kube-api-access-twvr2\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.915353 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" event={"ID":"49e41280-1166-4853-b3f9-83436578577b","Type":"ContainerDied","Data":"48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949"} Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.915407 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48af2991b674e813a3d060103f60c436f2f4c264379bd12c44bc3a86fcfec949" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.915493 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.994972 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv"] Dec 03 12:57:16 crc kubenswrapper[4666]: E1203 12:57:16.995577 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e41280-1166-4853-b3f9-83436578577b" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.995669 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e41280-1166-4853-b3f9-83436578577b" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.996001 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e41280-1166-4853-b3f9-83436578577b" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.996716 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.998627 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.999914 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:16 crc kubenswrapper[4666]: I1203 12:57:16.999957 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.000033 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrln\" (UniqueName: \"kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.000071 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.000193 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.001488 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.008161 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv"] Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.103626 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.103860 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrln\" (UniqueName: \"kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.103922 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" 
(UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.109125 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.109310 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.121880 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrln\" (UniqueName: \"kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.319149 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.925408 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv"] Dec 03 12:57:17 crc kubenswrapper[4666]: I1203 12:57:17.928812 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" event={"ID":"4425997f-9f2e-437e-8c5b-1754d4f7abac","Type":"ContainerStarted","Data":"b3a767e5b86e77cdf7b4c325456127bf2c43abb0953d3658f02582eba41f02b4"} Dec 03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.424327 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:57:18 crc kubenswrapper[4666]: E1203 12:57:18.425970 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.615924 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.690040 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"] Dec 03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.939712 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4pr6q" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="registry-server" containerID="cri-o://558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502" gracePeriod=2 Dec 
03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.940258 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" event={"ID":"4425997f-9f2e-437e-8c5b-1754d4f7abac","Type":"ContainerStarted","Data":"8364d2c22d0c906da098c9da4476088fa324d11c4a86514afcb57dcedefe42c5"} Dec 03 12:57:18 crc kubenswrapper[4666]: I1203 12:57:18.980658 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" podStartSLOduration=2.526148622 podStartE2EDuration="2.980641111s" podCreationTimestamp="2025-12-03 12:57:16 +0000 UTC" firstStartedPulling="2025-12-03 12:57:17.920754003 +0000 UTC m=+2626.765715054" lastFinishedPulling="2025-12-03 12:57:18.375246482 +0000 UTC m=+2627.220207543" observedRunningTime="2025-12-03 12:57:18.971859305 +0000 UTC m=+2627.816820346" watchObservedRunningTime="2025-12-03 12:57:18.980641111 +0000 UTC m=+2627.825602162" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.333861 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.348360 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content\") pod \"1b09b859-b95b-448d-9060-806e772410c1\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.348498 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities\") pod \"1b09b859-b95b-448d-9060-806e772410c1\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.348562 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k767h\" (UniqueName: \"kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h\") pod \"1b09b859-b95b-448d-9060-806e772410c1\" (UID: \"1b09b859-b95b-448d-9060-806e772410c1\") " Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.350881 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities" (OuterVolumeSpecName: "utilities") pod "1b09b859-b95b-448d-9060-806e772410c1" (UID: "1b09b859-b95b-448d-9060-806e772410c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.351323 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.368422 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h" (OuterVolumeSpecName: "kube-api-access-k767h") pod "1b09b859-b95b-448d-9060-806e772410c1" (UID: "1b09b859-b95b-448d-9060-806e772410c1"). InnerVolumeSpecName "kube-api-access-k767h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.375009 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b09b859-b95b-448d-9060-806e772410c1" (UID: "1b09b859-b95b-448d-9060-806e772410c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.452206 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k767h\" (UniqueName: \"kubernetes.io/projected/1b09b859-b95b-448d-9060-806e772410c1-kube-api-access-k767h\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.452241 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b09b859-b95b-448d-9060-806e772410c1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.948751 4666 generic.go:334] "Generic (PLEG): container finished" podID="1b09b859-b95b-448d-9060-806e772410c1" containerID="558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502" exitCode=0 Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.948807 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pr6q" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.948835 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerDied","Data":"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502"} Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.949355 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pr6q" event={"ID":"1b09b859-b95b-448d-9060-806e772410c1","Type":"ContainerDied","Data":"75e826bfbbed639b97a3bd7fd4a887c73cd4dcea91c9b7a5b40204a04c0809e9"} Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.949378 4666 scope.go:117] "RemoveContainer" containerID="558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.975618 4666 scope.go:117] "RemoveContainer" containerID="aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022" Dec 03 12:57:19 crc kubenswrapper[4666]: I1203 12:57:19.980708 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"] Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.007436 4666 scope.go:117] "RemoveContainer" containerID="0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.007581 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pr6q"] Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.039526 4666 scope.go:117] "RemoveContainer" containerID="558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502" Dec 03 12:57:20 crc kubenswrapper[4666]: E1203 12:57:20.040040 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502\": container with ID starting with 
558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502 not found: ID does not exist" containerID="558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.040068 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502"} err="failed to get container status \"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502\": rpc error: code = NotFound desc = could not find container \"558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502\": container with ID starting with 558205de3fbccaa7121c309201d4610acb5ca48aa2a250ffcbc263ec6e9ba502 not found: ID does not exist" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.040102 4666 scope.go:117] "RemoveContainer" containerID="aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022" Dec 03 12:57:20 crc kubenswrapper[4666]: E1203 12:57:20.040352 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022\": container with ID starting with aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022 not found: ID does not exist" containerID="aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.040377 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022"} err="failed to get container status \"aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022\": rpc error: code = NotFound desc = could not find container \"aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022\": container with ID starting with aca8829969041edcb17f94c51ed8ef73fce0053e015412c6691ebf1ef0cdc022 not found: ID does not exist" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.040408 4666 scope.go:117] "RemoveContainer" containerID="0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68" Dec 03 12:57:20 crc kubenswrapper[4666]: E1203 12:57:20.040570 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68\": container with ID starting with 0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68 not found: ID does not exist" containerID="0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68" Dec 03 12:57:20 crc kubenswrapper[4666]: I1203 12:57:20.040590 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68"} err="failed to get container status \"0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68\": rpc error: code = NotFound desc = could not find container \"0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68\": container with ID starting with 0d8ab51cdd66dd75349728dcb1320b230825d39b453a0f13a6eee5d984de5e68 not found: ID does not exist" Dec 03 12:57:21 crc kubenswrapper[4666]: I1203 12:57:21.441593 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b09b859-b95b-448d-9060-806e772410c1" path="/var/lib/kubelet/pods/1b09b859-b95b-448d-9060-806e772410c1/volumes" Dec 03 12:57:31 crc kubenswrapper[4666]: I1203 12:57:31.428454 
4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:57:31 crc kubenswrapper[4666]: E1203 12:57:31.429031 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:57:35 crc kubenswrapper[4666]: I1203 12:57:35.049388 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qr2ph"] Dec 03 12:57:35 crc kubenswrapper[4666]: I1203 12:57:35.060682 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qr2ph"] Dec 03 12:57:35 crc kubenswrapper[4666]: I1203 12:57:35.434714 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7912a6f-d432-4e4c-8c31-0023deac5557" path="/var/lib/kubelet/pods/e7912a6f-d432-4e4c-8c31-0023deac5557/volumes" Dec 03 12:57:46 crc kubenswrapper[4666]: I1203 12:57:46.423620 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:57:46 crc kubenswrapper[4666]: E1203 12:57:46.424420 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:57:48 crc kubenswrapper[4666]: I1203 12:57:48.958216 4666 scope.go:117] "RemoveContainer" containerID="97e1fca69628ee42cca3434b8c8b2ba5795291e2c919a0bcf6e9cc1f1f398d4d" Dec 03 12:57:48 crc kubenswrapper[4666]: I1203 12:57:48.981310 4666 scope.go:117] "RemoveContainer" containerID="09db20aae634ddf485f19ff2fe5abc7e0b1ee634a6e4dabc74af3c08dec48e66" Dec 03 12:57:49 crc kubenswrapper[4666]: I1203 12:57:49.029798 4666 scope.go:117] "RemoveContainer" containerID="a2073092072ec6c89786ae40717353619657ea8ccd9a68f703b28c6092d09428" Dec 03 12:57:49 crc kubenswrapper[4666]: I1203 12:57:49.053741 4666 scope.go:117] "RemoveContainer" containerID="1c8458fec2fe930bd0dd1aaba74ed6fde386701b6b50f247707095572e6f2be1" Dec 03 12:57:49 crc kubenswrapper[4666]: I1203 12:57:49.118351 4666 scope.go:117] "RemoveContainer" containerID="26efb468e7ccf5b0be81244d4e786ca27e41fda98463bb1e29cb129b83cb7464" Dec 03 12:57:49 crc kubenswrapper[4666]: I1203 12:57:49.134866 4666 scope.go:117] "RemoveContainer" containerID="c7b8e36a55f83333562a64d217284ffeda7eaa01718c5f49529c102007ce49a5" Dec 03 12:57:49 crc kubenswrapper[4666]: I1203 12:57:49.171569 4666 scope.go:117] "RemoveContainer" containerID="39f40f2c6efbd7a882478c5e9843c1e6358a05338bfbb88bae7015f53aac9aa9" Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.059437 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5m5fc"] Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.071185 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sts7r"] Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.079288 4666 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5m5fc"] Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.088870 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sts7r"] Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.437414 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b702d3b-190d-48dc-8ee0-531b9d6f712b" path="/var/lib/kubelet/pods/1b702d3b-190d-48dc-8ee0-531b9d6f712b/volumes" Dec 03 12:57:53 crc kubenswrapper[4666]: I1203 12:57:53.438520 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0002d8-cf2c-4ad2-a464-f1cff29a03d7" path="/var/lib/kubelet/pods/8d0002d8-cf2c-4ad2-a464-f1cff29a03d7/volumes" Dec 03 12:58:00 crc kubenswrapper[4666]: I1203 12:58:00.423825 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:58:00 crc kubenswrapper[4666]: E1203 12:58:00.424517 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:58:06 crc kubenswrapper[4666]: I1203 12:58:06.356539 4666 generic.go:334] "Generic (PLEG): container finished" podID="4425997f-9f2e-437e-8c5b-1754d4f7abac" containerID="8364d2c22d0c906da098c9da4476088fa324d11c4a86514afcb57dcedefe42c5" exitCode=0 Dec 03 12:58:06 crc kubenswrapper[4666]: I1203 12:58:06.356645 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" event={"ID":"4425997f-9f2e-437e-8c5b-1754d4f7abac","Type":"ContainerDied","Data":"8364d2c22d0c906da098c9da4476088fa324d11c4a86514afcb57dcedefe42c5"} Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.273287 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.366585 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key\") pod \"4425997f-9f2e-437e-8c5b-1754d4f7abac\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.366926 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory\") pod \"4425997f-9f2e-437e-8c5b-1754d4f7abac\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.367005 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxrln\" (UniqueName: \"kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln\") pod \"4425997f-9f2e-437e-8c5b-1754d4f7abac\" (UID: \"4425997f-9f2e-437e-8c5b-1754d4f7abac\") " Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.373307 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln" (OuterVolumeSpecName: "kube-api-access-mxrln") pod "4425997f-9f2e-437e-8c5b-1754d4f7abac" (UID: "4425997f-9f2e-437e-8c5b-1754d4f7abac"). InnerVolumeSpecName "kube-api-access-mxrln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.377931 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" event={"ID":"4425997f-9f2e-437e-8c5b-1754d4f7abac","Type":"ContainerDied","Data":"b3a767e5b86e77cdf7b4c325456127bf2c43abb0953d3658f02582eba41f02b4"} Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.377990 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a767e5b86e77cdf7b4c325456127bf2c43abb0953d3658f02582eba41f02b4" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.377997 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.396762 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4425997f-9f2e-437e-8c5b-1754d4f7abac" (UID: "4425997f-9f2e-437e-8c5b-1754d4f7abac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.402409 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory" (OuterVolumeSpecName: "inventory") pod "4425997f-9f2e-437e-8c5b-1754d4f7abac" (UID: "4425997f-9f2e-437e-8c5b-1754d4f7abac"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.456517 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hcgrl"] Dec 03 12:58:08 crc kubenswrapper[4666]: E1203 12:58:08.456947 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4425997f-9f2e-437e-8c5b-1754d4f7abac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.456969 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4425997f-9f2e-437e-8c5b-1754d4f7abac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:08 crc kubenswrapper[4666]: E1203 12:58:08.456996 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="extract-content" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457002 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="extract-content" Dec 03 12:58:08 crc kubenswrapper[4666]: E1203 12:58:08.457016 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="registry-server" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457023 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="registry-server" Dec 03 12:58:08 crc kubenswrapper[4666]: E1203 12:58:08.457035 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="extract-utilities" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457041 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="extract-utilities" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457233 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b09b859-b95b-448d-9060-806e772410c1" containerName="registry-server" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457245 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4425997f-9f2e-437e-8c5b-1754d4f7abac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.457887 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.470501 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.470524 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4425997f-9f2e-437e-8c5b-1754d4f7abac-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.470535 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxrln\" (UniqueName: \"kubernetes.io/projected/4425997f-9f2e-437e-8c5b-1754d4f7abac-kube-api-access-mxrln\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.478470 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hcgrl"] Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.572000 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.572124 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.572148 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhctb\" (UniqueName: \"kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.674245 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.674368 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhctb\" (UniqueName: \"kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.674394 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.678311 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.678821 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.693235 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhctb\" (UniqueName: \"kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb\") pod \"ssh-known-hosts-edpm-deployment-hcgrl\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:08 crc kubenswrapper[4666]: I1203 12:58:08.773416 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:09 crc kubenswrapper[4666]: I1203 12:58:09.353987 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hcgrl"] Dec 03 12:58:09 crc kubenswrapper[4666]: I1203 12:58:09.387984 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" event={"ID":"6ba9df91-2576-46bc-ae29-daaf447e1ccf","Type":"ContainerStarted","Data":"efea5fd26d897b6194ace73029e098b1642603869880ca1c0cef56d8908b8b51"} Dec 03 12:58:10 crc kubenswrapper[4666]: I1203 12:58:10.397459 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" event={"ID":"6ba9df91-2576-46bc-ae29-daaf447e1ccf","Type":"ContainerStarted","Data":"478a52901eab6d8affb9713243a844edfa72cf06b78f712397086b665a997af0"} Dec 03 12:58:10 crc kubenswrapper[4666]: I1203 12:58:10.422196 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" podStartSLOduration=1.978689958 podStartE2EDuration="2.422172929s" podCreationTimestamp="2025-12-03 12:58:08 +0000 UTC" firstStartedPulling="2025-12-03 12:58:09.358710376 +0000 UTC m=+2678.203671427" lastFinishedPulling="2025-12-03 12:58:09.802193327 +0000 UTC m=+2678.647154398" observedRunningTime="2025-12-03 12:58:10.412740425 +0000 UTC m=+2679.257701506" watchObservedRunningTime="2025-12-03 12:58:10.422172929 +0000 UTC m=+2679.267134010" Dec 03 12:58:13 crc kubenswrapper[4666]: I1203 12:58:13.423679 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:58:13 crc kubenswrapper[4666]: E1203 12:58:13.424196 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" 
podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:58:16 crc kubenswrapper[4666]: I1203 12:58:16.446347 4666 generic.go:334] "Generic (PLEG): container finished" podID="6ba9df91-2576-46bc-ae29-daaf447e1ccf" containerID="478a52901eab6d8affb9713243a844edfa72cf06b78f712397086b665a997af0" exitCode=0 Dec 03 12:58:16 crc kubenswrapper[4666]: I1203 12:58:16.446616 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" event={"ID":"6ba9df91-2576-46bc-ae29-daaf447e1ccf","Type":"ContainerDied","Data":"478a52901eab6d8affb9713243a844edfa72cf06b78f712397086b665a997af0"} Dec 03 12:58:17 crc kubenswrapper[4666]: I1203 12:58:17.978315 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.047787 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhctb\" (UniqueName: \"kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb\") pod \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.048081 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0\") pod \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.048309 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam\") pod \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\" (UID: \"6ba9df91-2576-46bc-ae29-daaf447e1ccf\") " Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.053247 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb" (OuterVolumeSpecName: "kube-api-access-qhctb") pod "6ba9df91-2576-46bc-ae29-daaf447e1ccf" (UID: "6ba9df91-2576-46bc-ae29-daaf447e1ccf"). InnerVolumeSpecName "kube-api-access-qhctb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.078050 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "6ba9df91-2576-46bc-ae29-daaf447e1ccf" (UID: "6ba9df91-2576-46bc-ae29-daaf447e1ccf"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.078872 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ba9df91-2576-46bc-ae29-daaf447e1ccf" (UID: "6ba9df91-2576-46bc-ae29-daaf447e1ccf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.150337 4666 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.150378 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ba9df91-2576-46bc-ae29-daaf447e1ccf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.150390 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhctb\" (UniqueName: \"kubernetes.io/projected/6ba9df91-2576-46bc-ae29-daaf447e1ccf-kube-api-access-qhctb\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.466515 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" event={"ID":"6ba9df91-2576-46bc-ae29-daaf447e1ccf","Type":"ContainerDied","Data":"efea5fd26d897b6194ace73029e098b1642603869880ca1c0cef56d8908b8b51"} Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.466571 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efea5fd26d897b6194ace73029e098b1642603869880ca1c0cef56d8908b8b51" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.466651 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hcgrl" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.542156 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5"] Dec 03 12:58:18 crc kubenswrapper[4666]: E1203 12:58:18.542598 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9df91-2576-46bc-ae29-daaf447e1ccf" containerName="ssh-known-hosts-edpm-deployment" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.542616 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9df91-2576-46bc-ae29-daaf447e1ccf" containerName="ssh-known-hosts-edpm-deployment" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.542784 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba9df91-2576-46bc-ae29-daaf447e1ccf" containerName="ssh-known-hosts-edpm-deployment" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.543440 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.547795 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.547866 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.547981 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.547989 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.550596 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5"] Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.663039 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2725\" (UniqueName: \"kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.663229 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.663295 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.765516 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2725\" (UniqueName: \"kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.765604 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.765628 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.770856 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.771416 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.784968 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2725\" (UniqueName: \"kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-p6kp5\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:18 crc kubenswrapper[4666]: I1203 12:58:18.868548 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:19 crc kubenswrapper[4666]: I1203 12:58:19.373455 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5"] Dec 03 12:58:19 crc kubenswrapper[4666]: I1203 12:58:19.474759 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" event={"ID":"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510","Type":"ContainerStarted","Data":"e8be8ea84d91b1b677048ad713ad2a68058deb1feb620ac0d6f43bb88022292e"} Dec 03 12:58:20 crc kubenswrapper[4666]: I1203 12:58:20.484193 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" event={"ID":"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510","Type":"ContainerStarted","Data":"08c41bee9a85eb9c9121aa899a764e4b287096b4a5e4287a44df9522a708434a"} Dec 03 12:58:20 crc kubenswrapper[4666]: I1203 12:58:20.503379 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" podStartSLOduration=1.821900753 podStartE2EDuration="2.503361188s" podCreationTimestamp="2025-12-03 12:58:18 +0000 UTC" firstStartedPulling="2025-12-03 12:58:19.378958805 +0000 UTC m=+2688.223919856" lastFinishedPulling="2025-12-03 12:58:20.06041924 +0000 UTC m=+2688.905380291" observedRunningTime="2025-12-03 12:58:20.498413214 +0000 UTC m=+2689.343374275" watchObservedRunningTime="2025-12-03 12:58:20.503361188 +0000 UTC m=+2689.348322239" Dec 03 12:58:25 crc kubenswrapper[4666]: I1203 12:58:25.423896 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:58:25 crc kubenswrapper[4666]: E1203 12:58:25.424875 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 12:58:28 crc kubenswrapper[4666]: I1203 12:58:28.554495 4666 generic.go:334] "Generic (PLEG): container finished" podID="c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" containerID="08c41bee9a85eb9c9121aa899a764e4b287096b4a5e4287a44df9522a708434a" exitCode=0 Dec 03 12:58:28 crc kubenswrapper[4666]: I1203 12:58:28.554629 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" event={"ID":"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510","Type":"ContainerDied","Data":"08c41bee9a85eb9c9121aa899a764e4b287096b4a5e4287a44df9522a708434a"} Dec 03 12:58:29 crc kubenswrapper[4666]: I1203 12:58:29.981402 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.109823 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key\") pod \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.109996 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2725\" (UniqueName: \"kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725\") pod \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.110101 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory\") pod \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\" (UID: \"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510\") " Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.117385 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725" (OuterVolumeSpecName: "kube-api-access-m2725") pod "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" (UID: "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510"). InnerVolumeSpecName "kube-api-access-m2725". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.135921 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" (UID: "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.143892 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory" (OuterVolumeSpecName: "inventory") pod "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" (UID: "c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.218285 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2725\" (UniqueName: \"kubernetes.io/projected/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-kube-api-access-m2725\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.218400 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.218812 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.574714 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" event={"ID":"c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510","Type":"ContainerDied","Data":"e8be8ea84d91b1b677048ad713ad2a68058deb1feb620ac0d6f43bb88022292e"} Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.574766 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8be8ea84d91b1b677048ad713ad2a68058deb1feb620ac0d6f43bb88022292e" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.574765 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.639908 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz"] Dec 03 12:58:30 crc kubenswrapper[4666]: E1203 12:58:30.640279 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.640295 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.640503 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.641069 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.642675 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.643897 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.643962 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.644032 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.651728 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz"] Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.829931 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.830314 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8gr\" (UniqueName: \"kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.830460 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.931952 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.932015 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.932105 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8gr\" (UniqueName: \"kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: 
\"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.937617 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.937654 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.962496 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8gr\" (UniqueName: \"kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:30 crc kubenswrapper[4666]: I1203 12:58:30.982332 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:31 crc kubenswrapper[4666]: I1203 12:58:31.515762 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz"] Dec 03 12:58:31 crc kubenswrapper[4666]: I1203 12:58:31.590591 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" event={"ID":"3ac75587-f1c3-421b-b935-60ef80d9eb99","Type":"ContainerStarted","Data":"0ef9b7dec39876c54d5ab3fcea7f9d311c6da63360fad307be8e3a3578521962"} Dec 03 12:58:32 crc kubenswrapper[4666]: I1203 12:58:32.017369 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:58:32 crc kubenswrapper[4666]: I1203 12:58:32.610422 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" event={"ID":"3ac75587-f1c3-421b-b935-60ef80d9eb99","Type":"ContainerStarted","Data":"70ed663aefee3bedecc88814c9e13984ee7de31a1c843b79fb5c13c417638399"} Dec 03 12:58:32 crc kubenswrapper[4666]: I1203 12:58:32.637685 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" podStartSLOduration=2.143467701 podStartE2EDuration="2.637660578s" podCreationTimestamp="2025-12-03 12:58:30 +0000 UTC" firstStartedPulling="2025-12-03 12:58:31.519749179 +0000 UTC m=+2700.364710230" lastFinishedPulling="2025-12-03 12:58:32.013942046 +0000 UTC m=+2700.858903107" observedRunningTime="2025-12-03 12:58:32.627931376 +0000 UTC m=+2701.472892427" watchObservedRunningTime="2025-12-03 12:58:32.637660578 +0000 UTC m=+2701.482621629" Dec 03 12:58:37 crc kubenswrapper[4666]: I1203 12:58:37.069227 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-826qj"] Dec 03 12:58:37 crc kubenswrapper[4666]: I1203 12:58:37.076490 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-826qj"] Dec 03 12:58:37 crc kubenswrapper[4666]: I1203 12:58:37.434179 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49a6a9d-4720-4ff0-995e-612b636b2a92" path="/var/lib/kubelet/pods/d49a6a9d-4720-4ff0-995e-612b636b2a92/volumes" Dec 03 12:58:40 crc kubenswrapper[4666]: I1203 12:58:40.424111 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 12:58:40 crc kubenswrapper[4666]: I1203 12:58:40.682027 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852"} Dec 03 12:58:41 crc kubenswrapper[4666]: I1203 12:58:41.690847 4666 generic.go:334] "Generic (PLEG): container finished" podID="3ac75587-f1c3-421b-b935-60ef80d9eb99" containerID="70ed663aefee3bedecc88814c9e13984ee7de31a1c843b79fb5c13c417638399" exitCode=0 Dec 03 12:58:41 crc kubenswrapper[4666]: I1203 12:58:41.690947 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" event={"ID":"3ac75587-f1c3-421b-b935-60ef80d9eb99","Type":"ContainerDied","Data":"70ed663aefee3bedecc88814c9e13984ee7de31a1c843b79fb5c13c417638399"} Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.086898 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.149015 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key\") pod \"3ac75587-f1c3-421b-b935-60ef80d9eb99\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.149156 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory\") pod \"3ac75587-f1c3-421b-b935-60ef80d9eb99\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.149179 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s8gr\" (UniqueName: \"kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr\") pod \"3ac75587-f1c3-421b-b935-60ef80d9eb99\" (UID: \"3ac75587-f1c3-421b-b935-60ef80d9eb99\") " Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.154999 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr" (OuterVolumeSpecName: "kube-api-access-7s8gr") pod "3ac75587-f1c3-421b-b935-60ef80d9eb99" (UID: "3ac75587-f1c3-421b-b935-60ef80d9eb99"). InnerVolumeSpecName "kube-api-access-7s8gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.174178 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ac75587-f1c3-421b-b935-60ef80d9eb99" (UID: "3ac75587-f1c3-421b-b935-60ef80d9eb99"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.174208 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory" (OuterVolumeSpecName: "inventory") pod "3ac75587-f1c3-421b-b935-60ef80d9eb99" (UID: "3ac75587-f1c3-421b-b935-60ef80d9eb99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.251826 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.251902 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ac75587-f1c3-421b-b935-60ef80d9eb99-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.251915 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s8gr\" (UniqueName: \"kubernetes.io/projected/3ac75587-f1c3-421b-b935-60ef80d9eb99-kube-api-access-7s8gr\") on node \"crc\" DevicePath \"\"" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.708830 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" event={"ID":"3ac75587-f1c3-421b-b935-60ef80d9eb99","Type":"ContainerDied","Data":"0ef9b7dec39876c54d5ab3fcea7f9d311c6da63360fad307be8e3a3578521962"} Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.708873 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ef9b7dec39876c54d5ab3fcea7f9d311c6da63360fad307be8e3a3578521962" Dec 03 12:58:43 crc kubenswrapper[4666]: I1203 12:58:43.708937 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz" Dec 03 12:58:49 crc kubenswrapper[4666]: I1203 12:58:49.303505 4666 scope.go:117] "RemoveContainer" containerID="2ac794b5f11600be905c18b71906ddabf9c548c4f251843c9d8d1e93b2b70987" Dec 03 12:58:49 crc kubenswrapper[4666]: I1203 12:58:49.347190 4666 scope.go:117] "RemoveContainer" containerID="33fdfc004fbd6f16adf499e9c395bf05d28f9b38aa0ca16dfbee30e93a2ecc11" Dec 03 12:58:49 crc kubenswrapper[4666]: I1203 12:58:49.381501 4666 scope.go:117] "RemoveContainer" containerID="e664b29f6d0b1942a9718c8c47cbc4cf3fb8e248819376147a5ea8417a67d422" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.866796 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:08 crc kubenswrapper[4666]: E1203 12:59:08.867769 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac75587-f1c3-421b-b935-60ef80d9eb99" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.867788 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac75587-f1c3-421b-b935-60ef80d9eb99" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.868019 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac75587-f1c3-421b-b935-60ef80d9eb99" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.869465 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.884040 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.934715 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.934778 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lcv\" (UniqueName: \"kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:08 crc kubenswrapper[4666]: I1203 12:59:08.934836 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.036980 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.037112 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46lcv\" (UniqueName: \"kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.037141 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.037542 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.037605 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.058312 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-46lcv\" (UniqueName: \"kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv\") pod \"community-operators-hr5pf\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.186956 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.728095 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:09 crc kubenswrapper[4666]: I1203 12:59:09.929156 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerStarted","Data":"19e860b94f48718511dd367341c13895a12637f659a3341eddc79f08e6223219"} Dec 03 12:59:13 crc kubenswrapper[4666]: I1203 12:59:13.966314 4666 generic.go:334] "Generic (PLEG): container finished" podID="802599b7-cceb-418a-9cc2-fb6011a56641" containerID="cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554" exitCode=0 Dec 03 12:59:13 crc kubenswrapper[4666]: I1203 12:59:13.966453 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerDied","Data":"cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554"} Dec 03 12:59:14 crc kubenswrapper[4666]: I1203 12:59:14.982120 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerStarted","Data":"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4"} Dec 03 12:59:15 crc kubenswrapper[4666]: I1203 12:59:15.992767 4666 generic.go:334] "Generic (PLEG): container finished" podID="802599b7-cceb-418a-9cc2-fb6011a56641" containerID="99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4" exitCode=0 Dec 03 12:59:15 crc kubenswrapper[4666]: I1203 12:59:15.992825 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerDied","Data":"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4"} Dec 03 12:59:17 crc kubenswrapper[4666]: I1203 12:59:17.002579 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerStarted","Data":"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c"} Dec 03 12:59:17 crc kubenswrapper[4666]: I1203 12:59:17.025171 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hr5pf" podStartSLOduration=6.548813707 podStartE2EDuration="9.025154048s" podCreationTimestamp="2025-12-03 12:59:08 +0000 UTC" firstStartedPulling="2025-12-03 12:59:13.968570326 +0000 UTC m=+2742.813531377" lastFinishedPulling="2025-12-03 12:59:16.444910677 +0000 UTC m=+2745.289871718" observedRunningTime="2025-12-03 12:59:17.018296064 +0000 UTC m=+2745.863257135" watchObservedRunningTime="2025-12-03 12:59:17.025154048 +0000 UTC m=+2745.870115099" Dec 03 12:59:19 crc kubenswrapper[4666]: I1203 12:59:19.187531 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:19 crc kubenswrapper[4666]: I1203 12:59:19.187833 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:19 crc kubenswrapper[4666]: I1203 12:59:19.247894 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:29 crc kubenswrapper[4666]: I1203 12:59:29.251294 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:29 crc kubenswrapper[4666]: I1203 12:59:29.300658 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.105818 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hr5pf" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="registry-server" containerID="cri-o://e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c" gracePeriod=2 Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.569276 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.675942 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46lcv\" (UniqueName: \"kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv\") pod \"802599b7-cceb-418a-9cc2-fb6011a56641\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.676225 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities\") pod \"802599b7-cceb-418a-9cc2-fb6011a56641\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.676360 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content\") pod \"802599b7-cceb-418a-9cc2-fb6011a56641\" (UID: \"802599b7-cceb-418a-9cc2-fb6011a56641\") " Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.677324 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities" (OuterVolumeSpecName: "utilities") pod "802599b7-cceb-418a-9cc2-fb6011a56641" (UID: "802599b7-cceb-418a-9cc2-fb6011a56641"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.681193 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv" (OuterVolumeSpecName: "kube-api-access-46lcv") pod "802599b7-cceb-418a-9cc2-fb6011a56641" (UID: "802599b7-cceb-418a-9cc2-fb6011a56641"). InnerVolumeSpecName "kube-api-access-46lcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.725710 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "802599b7-cceb-418a-9cc2-fb6011a56641" (UID: "802599b7-cceb-418a-9cc2-fb6011a56641"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.778659 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46lcv\" (UniqueName: \"kubernetes.io/projected/802599b7-cceb-418a-9cc2-fb6011a56641-kube-api-access-46lcv\") on node \"crc\" DevicePath \"\"" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.778692 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:59:30 crc kubenswrapper[4666]: I1203 12:59:30.778703 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/802599b7-cceb-418a-9cc2-fb6011a56641-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.118781 4666 generic.go:334] "Generic (PLEG): container finished" podID="802599b7-cceb-418a-9cc2-fb6011a56641" containerID="e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c" exitCode=0 Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.118844 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerDied","Data":"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c"} Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.118880 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hr5pf" event={"ID":"802599b7-cceb-418a-9cc2-fb6011a56641","Type":"ContainerDied","Data":"19e860b94f48718511dd367341c13895a12637f659a3341eddc79f08e6223219"} Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.118906 4666 scope.go:117] "RemoveContainer" containerID="e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.119308 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hr5pf" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.141630 4666 scope.go:117] "RemoveContainer" containerID="99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.158516 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.165576 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hr5pf"] Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.191408 4666 scope.go:117] "RemoveContainer" containerID="cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.206066 4666 scope.go:117] "RemoveContainer" containerID="e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c" Dec 03 12:59:31 crc kubenswrapper[4666]: E1203 12:59:31.206604 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c\": container with ID starting with e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c not found: ID does not exist" containerID="e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.206644 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c"} err="failed to get container status \"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c\": rpc error: code = NotFound desc = could not find container \"e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c\": container with ID starting with e1754b9a047678542046609f0e2ade21000b1870d2344d6f2ee9b7c1ea20109c not found: ID does not exist" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.206673 4666 scope.go:117] "RemoveContainer" containerID="99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4" Dec 03 12:59:31 crc kubenswrapper[4666]: E1203 12:59:31.206966 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4\": container with ID starting with 99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4 not found: ID does not exist" containerID="99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.207022 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4"} err="failed to get container status \"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4\": rpc error: code = NotFound desc = could not find container \"99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4\": container with ID starting with 99a9d1065427d72cbc1f13b3ba3297a9d1fc158ebb01b140c131b5febe8140b4 not found: ID does not exist" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.207057 4666 scope.go:117] "RemoveContainer" containerID="cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554" Dec 03 12:59:31 crc kubenswrapper[4666]: E1203 12:59:31.207461 4666 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554\": container with ID starting with cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554 not found: ID does not exist" containerID="cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.207487 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554"} err="failed to get container status \"cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554\": rpc error: code = NotFound desc = could not find container \"cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554\": container with ID starting with cfb5d5a6b039c2e63169a5f8ab6cb399646517d3ba975d8b13ffc0baaab38554 not found: ID does not exist" Dec 03 12:59:31 crc kubenswrapper[4666]: I1203 12:59:31.433862 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" path="/var/lib/kubelet/pods/802599b7-cceb-418a-9cc2-fb6011a56641/volumes" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.144808 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5"] Dec 03 13:00:00 crc kubenswrapper[4666]: E1203 13:00:00.145817 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="extract-content" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.145835 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="extract-content" Dec 03 13:00:00 crc kubenswrapper[4666]: E1203 13:00:00.145884 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="extract-utilities" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.145893 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="extract-utilities" Dec 03 13:00:00 crc kubenswrapper[4666]: E1203 13:00:00.145908 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="registry-server" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.145916 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="registry-server" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.146135 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="802599b7-cceb-418a-9cc2-fb6011a56641" containerName="registry-server" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.146937 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.149285 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.149591 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.160144 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5"] Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.243374 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.243482 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.243674 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72gm\" (UniqueName: \"kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.345649 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c72gm\" (UniqueName: \"kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.345747 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.345799 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.346814 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume\") pod 
\"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.364253 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.367608 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72gm\" (UniqueName: \"kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm\") pod \"collect-profiles-29412780-bjwt5\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.479375 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:00 crc kubenswrapper[4666]: I1203 13:00:00.938428 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5"] Dec 03 13:00:01 crc kubenswrapper[4666]: I1203 13:00:01.444008 4666 generic.go:334] "Generic (PLEG): container finished" podID="870f863c-0bf4-437b-9c21-90e68cea84de" containerID="d23087e2b27dfb7d67f23c16864bdd8a4ab38019f28f75d3cf8402f85a495845" exitCode=0 Dec 03 13:00:01 crc kubenswrapper[4666]: I1203 13:00:01.444269 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" event={"ID":"870f863c-0bf4-437b-9c21-90e68cea84de","Type":"ContainerDied","Data":"d23087e2b27dfb7d67f23c16864bdd8a4ab38019f28f75d3cf8402f85a495845"} Dec 03 13:00:01 crc kubenswrapper[4666]: I1203 13:00:01.444295 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" event={"ID":"870f863c-0bf4-437b-9c21-90e68cea84de","Type":"ContainerStarted","Data":"7007b8f10a31e30855508e4a9139d6386c4dcfbe2bca0e447829d674db0c520f"} Dec 03 13:00:02 crc kubenswrapper[4666]: I1203 13:00:02.854277 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.005147 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume\") pod \"870f863c-0bf4-437b-9c21-90e68cea84de\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.005298 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c72gm\" (UniqueName: \"kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm\") pod \"870f863c-0bf4-437b-9c21-90e68cea84de\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.005508 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume\") pod \"870f863c-0bf4-437b-9c21-90e68cea84de\" (UID: \"870f863c-0bf4-437b-9c21-90e68cea84de\") " Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.006040 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume" (OuterVolumeSpecName: "config-volume") pod "870f863c-0bf4-437b-9c21-90e68cea84de" (UID: "870f863c-0bf4-437b-9c21-90e68cea84de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.011463 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "870f863c-0bf4-437b-9c21-90e68cea84de" (UID: "870f863c-0bf4-437b-9c21-90e68cea84de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.011592 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm" (OuterVolumeSpecName: "kube-api-access-c72gm") pod "870f863c-0bf4-437b-9c21-90e68cea84de" (UID: "870f863c-0bf4-437b-9c21-90e68cea84de"). InnerVolumeSpecName "kube-api-access-c72gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.107574 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c72gm\" (UniqueName: \"kubernetes.io/projected/870f863c-0bf4-437b-9c21-90e68cea84de-kube-api-access-c72gm\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.107611 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/870f863c-0bf4-437b-9c21-90e68cea84de-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.107624 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/870f863c-0bf4-437b-9c21-90e68cea84de-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.465257 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" event={"ID":"870f863c-0bf4-437b-9c21-90e68cea84de","Type":"ContainerDied","Data":"7007b8f10a31e30855508e4a9139d6386c4dcfbe2bca0e447829d674db0c520f"} Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.465299 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7007b8f10a31e30855508e4a9139d6386c4dcfbe2bca0e447829d674db0c520f" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.465300 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5" Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.940421 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"] Dec 03 13:00:03 crc kubenswrapper[4666]: I1203 13:00:03.950423 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-vnxtv"] Dec 03 13:00:05 crc kubenswrapper[4666]: I1203 13:00:05.433552 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029e08a8-b4d9-470b-9a9e-364f1a52fd2f" path="/var/lib/kubelet/pods/029e08a8-b4d9-470b-9a9e-364f1a52fd2f/volumes" Dec 03 13:00:26 crc kubenswrapper[4666]: I1203 13:00:26.957800 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:26 crc kubenswrapper[4666]: E1203 13:00:26.958947 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870f863c-0bf4-437b-9c21-90e68cea84de" containerName="collect-profiles" Dec 03 13:00:26 crc kubenswrapper[4666]: I1203 13:00:26.958964 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="870f863c-0bf4-437b-9c21-90e68cea84de" containerName="collect-profiles" Dec 03 13:00:26 crc kubenswrapper[4666]: I1203 13:00:26.959232 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="870f863c-0bf4-437b-9c21-90e68cea84de" containerName="collect-profiles" Dec 03 13:00:26 crc kubenswrapper[4666]: I1203 13:00:26.960782 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:26 crc kubenswrapper[4666]: I1203 13:00:26.971117 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.087395 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s648t\" (UniqueName: \"kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.087511 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.087555 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.190198 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s648t\" (UniqueName: \"kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.190345 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.190400 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.190928 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.190943 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.213168 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s648t\" (UniqueName: \"kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t\") pod \"redhat-operators-grxpk\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.288198 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:27 crc kubenswrapper[4666]: I1203 13:00:27.741886 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:28 crc kubenswrapper[4666]: I1203 13:00:28.688195 4666 generic.go:334] "Generic (PLEG): container finished" podID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerID="8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63" exitCode=0 Dec 03 13:00:28 crc kubenswrapper[4666]: I1203 13:00:28.688295 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerDied","Data":"8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63"} Dec 03 13:00:28 crc kubenswrapper[4666]: I1203 13:00:28.690646 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerStarted","Data":"18c8fe9edac38f4891dd61aa1e89f7cac3a5cc44bc8b6acd073cbc699f60b46b"} Dec 03 13:00:28 crc kubenswrapper[4666]: I1203 13:00:28.690080 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:00:31 crc kubenswrapper[4666]: I1203 13:00:31.719779 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerStarted","Data":"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285"} Dec 03 13:00:34 crc kubenswrapper[4666]: I1203 13:00:34.750335 4666 generic.go:334] "Generic (PLEG): container finished" podID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerID="d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285" exitCode=0 Dec 03 13:00:34 crc kubenswrapper[4666]: I1203 13:00:34.750405 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerDied","Data":"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285"} Dec 03 13:00:36 crc kubenswrapper[4666]: I1203 13:00:36.767462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerStarted","Data":"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1"} Dec 03 13:00:36 crc kubenswrapper[4666]: I1203 13:00:36.784753 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grxpk" podStartSLOduration=3.691512281 podStartE2EDuration="10.784735557s" podCreationTimestamp="2025-12-03 13:00:26 +0000 UTC" firstStartedPulling="2025-12-03 13:00:28.689829739 +0000 UTC m=+2817.534790790" lastFinishedPulling="2025-12-03 13:00:35.783053015 +0000 UTC m=+2824.628014066" observedRunningTime="2025-12-03 13:00:36.781291605 +0000 UTC m=+2825.626252666" watchObservedRunningTime="2025-12-03 13:00:36.784735557 +0000 UTC m=+2825.629696608" Dec 03 13:00:37 crc 
kubenswrapper[4666]: I1203 13:00:37.288455 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:37 crc kubenswrapper[4666]: I1203 13:00:37.288781 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:38 crc kubenswrapper[4666]: I1203 13:00:38.341616 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grxpk" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="registry-server" probeResult="failure" output=< Dec 03 13:00:38 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 13:00:38 crc kubenswrapper[4666]: > Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.881658 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.884161 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.896863 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.964850 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.965197 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:40 crc kubenswrapper[4666]: I1203 13:00:40.965222 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbg6\" (UniqueName: \"kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.067154 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.067229 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.067256 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbg6\" (UniqueName: 
\"kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.067790 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.068207 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.091929 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbg6\" (UniqueName: \"kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6\") pod \"certified-operators-s7rkg\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.202777 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.713149 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:00:41 crc kubenswrapper[4666]: W1203 13:00:41.719982 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c2ec97_7692_4f1f_93b6_7a0f01bb6db1.slice/crio-fd7a813cf08e5c44969a42ca92c2bb27920eb2ad274e670006e51205e878e446 WatchSource:0}: Error finding container fd7a813cf08e5c44969a42ca92c2bb27920eb2ad274e670006e51205e878e446: Status 404 returned error can't find the container with id fd7a813cf08e5c44969a42ca92c2bb27920eb2ad274e670006e51205e878e446 Dec 03 13:00:41 crc kubenswrapper[4666]: I1203 13:00:41.819035 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerStarted","Data":"fd7a813cf08e5c44969a42ca92c2bb27920eb2ad274e670006e51205e878e446"} Dec 03 13:00:42 crc kubenswrapper[4666]: I1203 13:00:42.828710 4666 generic.go:334] "Generic (PLEG): container finished" podID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerID="8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2" exitCode=0 Dec 03 13:00:42 crc kubenswrapper[4666]: I1203 13:00:42.828762 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerDied","Data":"8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2"} Dec 03 13:00:45 crc kubenswrapper[4666]: I1203 13:00:45.853660 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerStarted","Data":"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8"} Dec 03 13:00:46 crc 
kubenswrapper[4666]: I1203 13:00:46.864633 4666 generic.go:334] "Generic (PLEG): container finished" podID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerID="f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8" exitCode=0 Dec 03 13:00:46 crc kubenswrapper[4666]: I1203 13:00:46.864712 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerDied","Data":"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8"} Dec 03 13:00:47 crc kubenswrapper[4666]: I1203 13:00:47.350218 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:47 crc kubenswrapper[4666]: I1203 13:00:47.405275 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:47 crc kubenswrapper[4666]: I1203 13:00:47.875604 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerStarted","Data":"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a"} Dec 03 13:00:47 crc kubenswrapper[4666]: I1203 13:00:47.895942 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s7rkg" podStartSLOduration=3.427486014 podStartE2EDuration="7.895921617s" podCreationTimestamp="2025-12-03 13:00:40 +0000 UTC" firstStartedPulling="2025-12-03 13:00:42.830818662 +0000 UTC m=+2831.675779713" lastFinishedPulling="2025-12-03 13:00:47.299254265 +0000 UTC m=+2836.144215316" observedRunningTime="2025-12-03 13:00:47.892750072 +0000 UTC m=+2836.737711123" watchObservedRunningTime="2025-12-03 13:00:47.895921617 +0000 UTC m=+2836.740882678" Dec 03 13:00:48 crc kubenswrapper[4666]: I1203 13:00:48.871728 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:48 crc kubenswrapper[4666]: I1203 13:00:48.883260 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grxpk" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="registry-server" containerID="cri-o://d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1" gracePeriod=2 Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.530634 4666 scope.go:117] "RemoveContainer" containerID="efcc894c7944ffaa899f2f883b007fa6e92e6b111af861c688aef827db16fcc9" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.815215 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.849718 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities\") pod \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.850019 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s648t\" (UniqueName: \"kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t\") pod \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.850129 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content\") pod \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\" (UID: \"146c98d4-7d43-4220-b4b1-3c5715ac6b6a\") " Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.854005 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities" (OuterVolumeSpecName: "utilities") pod "146c98d4-7d43-4220-b4b1-3c5715ac6b6a" (UID: "146c98d4-7d43-4220-b4b1-3c5715ac6b6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.860271 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t" (OuterVolumeSpecName: "kube-api-access-s648t") pod "146c98d4-7d43-4220-b4b1-3c5715ac6b6a" (UID: "146c98d4-7d43-4220-b4b1-3c5715ac6b6a"). InnerVolumeSpecName "kube-api-access-s648t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.895666 4666 generic.go:334] "Generic (PLEG): container finished" podID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerID="d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1" exitCode=0 Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.895716 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerDied","Data":"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1"} Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.895754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grxpk" event={"ID":"146c98d4-7d43-4220-b4b1-3c5715ac6b6a","Type":"ContainerDied","Data":"18c8fe9edac38f4891dd61aa1e89f7cac3a5cc44bc8b6acd073cbc699f60b46b"} Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.895769 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grxpk" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.895779 4666 scope.go:117] "RemoveContainer" containerID="d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.916838 4666 scope.go:117] "RemoveContainer" containerID="d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.934130 4666 scope.go:117] "RemoveContainer" containerID="8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.952119 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s648t\" (UniqueName: \"kubernetes.io/projected/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-kube-api-access-s648t\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.952162 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.953098 4666 scope.go:117] "RemoveContainer" containerID="d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1" Dec 03 13:00:49 crc kubenswrapper[4666]: E1203 13:00:49.953452 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1\": container with ID starting with d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1 not found: ID does not exist" containerID="d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.953529 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1"} err="failed to get container status \"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1\": rpc error: code = NotFound desc = could not find container \"d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1\": container with ID starting with d9262930371b22e732a564025bf78b0fb6cb847491796611353d34ec4810abd1 not found: ID does not exist" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.953550 4666 scope.go:117] "RemoveContainer" containerID="d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285" Dec 03 13:00:49 crc kubenswrapper[4666]: E1203 13:00:49.953792 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285\": container with ID starting with d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285 not found: ID does not exist" containerID="d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.953848 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285"} err="failed to get container status \"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285\": rpc error: code = NotFound desc = could not find container \"d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285\": container with ID starting with 
d4518f323db8234718ee974be50ec158ce17fbce53940eddd72120c55ff0a285 not found: ID does not exist" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.953867 4666 scope.go:117] "RemoveContainer" containerID="8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63" Dec 03 13:00:49 crc kubenswrapper[4666]: E1203 13:00:49.954105 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63\": container with ID starting with 8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63 not found: ID does not exist" containerID="8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.954133 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63"} err="failed to get container status \"8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63\": rpc error: code = NotFound desc = could not find container \"8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63\": container with ID starting with 8d911ecbd5c426a8f8f466990d19664f32c802cc5a7cbd270375ce0f337a0a63 not found: ID does not exist" Dec 03 13:00:49 crc kubenswrapper[4666]: I1203 13:00:49.973043 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "146c98d4-7d43-4220-b4b1-3c5715ac6b6a" (UID: "146c98d4-7d43-4220-b4b1-3c5715ac6b6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:00:50 crc kubenswrapper[4666]: I1203 13:00:50.053321 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c98d4-7d43-4220-b4b1-3c5715ac6b6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:00:50 crc kubenswrapper[4666]: I1203 13:00:50.232757 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:50 crc kubenswrapper[4666]: I1203 13:00:50.240995 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grxpk"] Dec 03 13:00:51 crc kubenswrapper[4666]: I1203 13:00:51.203833 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:51 crc kubenswrapper[4666]: I1203 13:00:51.204180 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:51 crc kubenswrapper[4666]: I1203 13:00:51.253399 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:00:51 crc kubenswrapper[4666]: I1203 13:00:51.436367 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" path="/var/lib/kubelet/pods/146c98d4-7d43-4220-b4b1-3c5715ac6b6a/volumes" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.152267 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412781-4pd56"] Dec 03 13:01:00 crc kubenswrapper[4666]: E1203 13:01:00.153302 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" 
containerName="extract-utilities" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.153321 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="extract-utilities" Dec 03 13:01:00 crc kubenswrapper[4666]: E1203 13:01:00.153357 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="extract-content" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.153366 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="extract-content" Dec 03 13:01:00 crc kubenswrapper[4666]: E1203 13:01:00.153390 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="registry-server" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.153398 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="registry-server" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.153627 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="146c98d4-7d43-4220-b4b1-3c5715ac6b6a" containerName="registry-server" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.154463 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.163386 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412781-4pd56"] Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.357245 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.358070 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.358119 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.358259 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ccm\" (UniqueName: \"kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.459724 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ccm\" (UniqueName: \"kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm\") pod \"keystone-cron-29412781-4pd56\" (UID: 
\"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.459823 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.459898 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.459952 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.467534 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.468011 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.468057 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.480428 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ccm\" (UniqueName: \"kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm\") pod \"keystone-cron-29412781-4pd56\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:00 crc kubenswrapper[4666]: I1203 13:01:00.770232 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:01 crc kubenswrapper[4666]: I1203 13:01:01.188161 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412781-4pd56"] Dec 03 13:01:01 crc kubenswrapper[4666]: W1203 13:01:01.189554 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod867a9f13_2579_4e34_9c29_97847041400d.slice/crio-62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082 WatchSource:0}: Error finding container 62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082: Status 404 returned error can't find the container with id 62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082 Dec 03 13:01:01 crc kubenswrapper[4666]: I1203 13:01:01.280543 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:01:01 crc kubenswrapper[4666]: I1203 13:01:01.347016 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.003794 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412781-4pd56" event={"ID":"867a9f13-2579-4e34-9c29-97847041400d","Type":"ContainerStarted","Data":"00b5574e459fa1a5424e7367e329e0f7f54818a7fcadbd4724bb4b1dd67728f3"} Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.003861 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412781-4pd56" event={"ID":"867a9f13-2579-4e34-9c29-97847041400d","Type":"ContainerStarted","Data":"62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082"} Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.004515 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s7rkg" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="registry-server" containerID="cri-o://386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a" gracePeriod=2 Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.033262 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412781-4pd56" podStartSLOduration=2.033241871 podStartE2EDuration="2.033241871s" podCreationTimestamp="2025-12-03 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:01:02.029716017 +0000 UTC m=+2850.874677058" watchObservedRunningTime="2025-12-03 13:01:02.033241871 +0000 UTC m=+2850.878202922" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.553207 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.603501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zbg6\" (UniqueName: \"kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6\") pod \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.603609 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content\") pod \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.603765 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities\") pod \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\" (UID: \"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1\") " Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.608503 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6" (OuterVolumeSpecName: "kube-api-access-4zbg6") pod "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" (UID: "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1"). InnerVolumeSpecName "kube-api-access-4zbg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.609040 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities" (OuterVolumeSpecName: "utilities") pod "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" (UID: "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.672349 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" (UID: "66c2ec97-7692-4f1f-93b6-7a0f01bb6db1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.707536 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zbg6\" (UniqueName: \"kubernetes.io/projected/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-kube-api-access-4zbg6\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.707619 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:02 crc kubenswrapper[4666]: I1203 13:01:02.707635 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.014915 4666 generic.go:334] "Generic (PLEG): container finished" podID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerID="386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a" exitCode=0 Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.015535 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7rkg" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.018164 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerDied","Data":"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a"} Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.018199 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7rkg" event={"ID":"66c2ec97-7692-4f1f-93b6-7a0f01bb6db1","Type":"ContainerDied","Data":"fd7a813cf08e5c44969a42ca92c2bb27920eb2ad274e670006e51205e878e446"} Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.018217 4666 scope.go:117] "RemoveContainer" containerID="386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.038986 4666 scope.go:117] "RemoveContainer" containerID="f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.049513 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.057646 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s7rkg"] Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.081867 4666 scope.go:117] "RemoveContainer" containerID="8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.119883 4666 scope.go:117] "RemoveContainer" containerID="386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a" Dec 03 13:01:03 crc kubenswrapper[4666]: E1203 13:01:03.125509 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a\": container with ID starting with 386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a not found: ID does not exist" containerID="386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.125576 
4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a"} err="failed to get container status \"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a\": rpc error: code = NotFound desc = could not find container \"386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a\": container with ID starting with 386c95de7f73655fde62b3b0660b05eb3e9031c2eab8ddafff8fa75c2e45ac3a not found: ID does not exist" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.125605 4666 scope.go:117] "RemoveContainer" containerID="f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8" Dec 03 13:01:03 crc kubenswrapper[4666]: E1203 13:01:03.125986 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8\": container with ID starting with f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8 not found: ID does not exist" containerID="f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.126032 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8"} err="failed to get container status \"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8\": rpc error: code = NotFound desc = could not find container \"f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8\": container with ID starting with f2dc1595b9660cab600ab4e5238a2e4565c5d8e9f338461fd0ee39cff8c1ccc8 not found: ID does not exist" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.126059 4666 scope.go:117] "RemoveContainer" containerID="8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2" Dec 03 13:01:03 crc kubenswrapper[4666]: E1203 13:01:03.126434 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2\": container with ID starting with 8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2 not found: ID does not exist" containerID="8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.126489 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2"} err="failed to get container status \"8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2\": rpc error: code = NotFound desc = could not find container \"8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2\": container with ID starting with 8e52aa1ea3966feb53b57057fe849c24be5b0d41c9cefa9a624e9188d3673fc2 not found: ID does not exist" Dec 03 13:01:03 crc kubenswrapper[4666]: I1203 13:01:03.433602 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" path="/var/lib/kubelet/pods/66c2ec97-7692-4f1f-93b6-7a0f01bb6db1/volumes" Dec 03 13:01:04 crc kubenswrapper[4666]: I1203 13:01:04.026201 4666 generic.go:334] "Generic (PLEG): container finished" podID="867a9f13-2579-4e34-9c29-97847041400d" containerID="00b5574e459fa1a5424e7367e329e0f7f54818a7fcadbd4724bb4b1dd67728f3" exitCode=0 Dec 03 13:01:04 crc kubenswrapper[4666]: 
I1203 13:01:04.026286 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412781-4pd56" event={"ID":"867a9f13-2579-4e34-9c29-97847041400d","Type":"ContainerDied","Data":"00b5574e459fa1a5424e7367e329e0f7f54818a7fcadbd4724bb4b1dd67728f3"} Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.369925 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.555677 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98ccm\" (UniqueName: \"kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm\") pod \"867a9f13-2579-4e34-9c29-97847041400d\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.555894 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys\") pod \"867a9f13-2579-4e34-9c29-97847041400d\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.555939 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle\") pod \"867a9f13-2579-4e34-9c29-97847041400d\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.555959 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data\") pod \"867a9f13-2579-4e34-9c29-97847041400d\" (UID: \"867a9f13-2579-4e34-9c29-97847041400d\") " Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.562125 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "867a9f13-2579-4e34-9c29-97847041400d" (UID: "867a9f13-2579-4e34-9c29-97847041400d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.562403 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm" (OuterVolumeSpecName: "kube-api-access-98ccm") pod "867a9f13-2579-4e34-9c29-97847041400d" (UID: "867a9f13-2579-4e34-9c29-97847041400d"). InnerVolumeSpecName "kube-api-access-98ccm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.593721 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "867a9f13-2579-4e34-9c29-97847041400d" (UID: "867a9f13-2579-4e34-9c29-97847041400d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.604729 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data" (OuterVolumeSpecName: "config-data") pod "867a9f13-2579-4e34-9c29-97847041400d" (UID: "867a9f13-2579-4e34-9c29-97847041400d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.658044 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.658099 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.658112 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98ccm\" (UniqueName: \"kubernetes.io/projected/867a9f13-2579-4e34-9c29-97847041400d-kube-api-access-98ccm\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:05 crc kubenswrapper[4666]: I1203 13:01:05.658126 4666 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/867a9f13-2579-4e34-9c29-97847041400d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 13:01:06 crc kubenswrapper[4666]: I1203 13:01:06.047302 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412781-4pd56" event={"ID":"867a9f13-2579-4e34-9c29-97847041400d","Type":"ContainerDied","Data":"62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082"} Dec 03 13:01:06 crc kubenswrapper[4666]: I1203 13:01:06.047625 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62957fab8df1954a43953cb6624eb1a381a0a5af8f1c244319c3bc5a29454082" Dec 03 13:01:06 crc kubenswrapper[4666]: I1203 13:01:06.047512 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412781-4pd56" Dec 03 13:01:09 crc kubenswrapper[4666]: I1203 13:01:09.866530 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:01:09 crc kubenswrapper[4666]: I1203 13:01:09.867415 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:01:39 crc kubenswrapper[4666]: I1203 13:01:39.866422 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:01:39 crc kubenswrapper[4666]: I1203 13:01:39.867021 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:02:09 crc kubenswrapper[4666]: I1203 13:02:09.866160 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:02:09 crc kubenswrapper[4666]: I1203 13:02:09.866902 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:02:09 crc kubenswrapper[4666]: I1203 13:02:09.866974 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:02:09 crc kubenswrapper[4666]: I1203 13:02:09.868295 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:02:09 crc kubenswrapper[4666]: I1203 13:02:09.868393 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852" gracePeriod=600 Dec 03 13:02:10 crc kubenswrapper[4666]: I1203 13:02:10.617476 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" 
containerID="b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852" exitCode=0 Dec 03 13:02:10 crc kubenswrapper[4666]: I1203 13:02:10.617666 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852"} Dec 03 13:02:10 crc kubenswrapper[4666]: I1203 13:02:10.617972 4666 scope.go:117] "RemoveContainer" containerID="176970ef3488d0411d6194c3d53f23859eb22c18c684bb48f7d429ba32ad6878" Dec 03 13:02:11 crc kubenswrapper[4666]: I1203 13:02:11.626814 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd"} Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.576161 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.585909 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.595750 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xc5pv"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.604133 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.611301 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.618309 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hcgrl"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.625141 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.631838 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-p6kp5"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.639421 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-txrd4"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.646434 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bjshv"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.654640 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hcgrl"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.662358 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.671634 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.679682 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xsx2s"] 
Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.688058 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.695231 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-mrnzx"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.701798 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x4qsz"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.708146 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8f9f4"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.715210 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"] Dec 03 13:03:43 crc kubenswrapper[4666]: I1203 13:03:43.721993 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mt54m"] Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.440600 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac75587-f1c3-421b-b935-60ef80d9eb99" path="/var/lib/kubelet/pods/3ac75587-f1c3-421b-b935-60ef80d9eb99/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.442259 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4425997f-9f2e-437e-8c5b-1754d4f7abac" path="/var/lib/kubelet/pods/4425997f-9f2e-437e-8c5b-1754d4f7abac/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.443469 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4621207e-db3e-4ef5-b234-4d9e7443ff87" path="/var/lib/kubelet/pods/4621207e-db3e-4ef5-b234-4d9e7443ff87/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.444626 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e41280-1166-4853-b3f9-83436578577b" path="/var/lib/kubelet/pods/49e41280-1166-4853-b3f9-83436578577b/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.446703 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe8a0fb-f506-4363-9006-23cd005d0e78" path="/var/lib/kubelet/pods/4fe8a0fb-f506-4363-9006-23cd005d0e78/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.447837 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba9df91-2576-46bc-ae29-daaf447e1ccf" path="/var/lib/kubelet/pods/6ba9df91-2576-46bc-ae29-daaf447e1ccf/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.449019 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86cef496-ab56-4162-b028-9d05332eb53c" path="/var/lib/kubelet/pods/86cef496-ab56-4162-b028-9d05332eb53c/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.451131 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c766de6f-79d7-4dc4-ae16-85a648bf8eb5" path="/var/lib/kubelet/pods/c766de6f-79d7-4dc4-ae16-85a648bf8eb5/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.452253 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510" path="/var/lib/kubelet/pods/c9c585ec-0b99-4b5c-bd1c-4a7c8ece3510/volumes" Dec 03 13:03:45 crc kubenswrapper[4666]: I1203 13:03:45.453507 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f18b806a-b9ab-4edd-83df-d8298d026d6e" path="/var/lib/kubelet/pods/f18b806a-b9ab-4edd-83df-d8298d026d6e/volumes" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.433183 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j"] Dec 03 13:03:49 crc kubenswrapper[4666]: E1203 13:03:49.433835 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="registry-server" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.433851 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="registry-server" Dec 03 13:03:49 crc kubenswrapper[4666]: E1203 13:03:49.433872 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867a9f13-2579-4e34-9c29-97847041400d" containerName="keystone-cron" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.433878 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="867a9f13-2579-4e34-9c29-97847041400d" containerName="keystone-cron" Dec 03 13:03:49 crc kubenswrapper[4666]: E1203 13:03:49.433898 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="extract-content" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.433903 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="extract-content" Dec 03 13:03:49 crc kubenswrapper[4666]: E1203 13:03:49.433914 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="extract-utilities" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.433920 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="extract-utilities" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.434160 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="867a9f13-2579-4e34-9c29-97847041400d" containerName="keystone-cron" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.434177 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c2ec97-7692-4f1f-93b6-7a0f01bb6db1" containerName="registry-server" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.434764 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.436923 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.436944 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.437514 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.443783 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.443871 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.444598 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j"] Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.582492 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.582619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdw46\" (UniqueName: \"kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.583331 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.583438 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.583573 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.679308 4666 scope.go:117] "RemoveContainer" 
containerID="ddf643fc384b3475726bbebee3e674ff3310fd46caf67465d9e2b5e76caf5f8b" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.684873 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.684996 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdw46\" (UniqueName: \"kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.685041 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.685070 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.685149 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.691607 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.692200 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.692375 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.694239 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.702143 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdw46\" (UniqueName: \"kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.705733 4666 scope.go:117] "RemoveContainer" containerID="d35d04e69f59b062d1eb964215373f2832a7c45275f47e6642d6aa0a30579199" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.785978 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.810696 4666 scope.go:117] "RemoveContainer" containerID="4b43bfd83e0f2efba46abee6c4cb646a1157a7edc5c6c6e64f0a5f55fe18cbdc" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.840852 4666 scope.go:117] "RemoveContainer" containerID="8364d2c22d0c906da098c9da4476088fa324d11c4a86514afcb57dcedefe42c5" Dec 03 13:03:49 crc kubenswrapper[4666]: I1203 13:03:49.937184 4666 scope.go:117] "RemoveContainer" containerID="f54d0a707f5883191ad902c738892e2be8a3fdee4bb26dd9be042abc0741dfde" Dec 03 13:03:50 crc kubenswrapper[4666]: I1203 13:03:50.003456 4666 scope.go:117] "RemoveContainer" containerID="08fbb4a9e5509c6b63897c658c496ba120bbfa9281740ac9261c5e01cfdea84d" Dec 03 13:03:50 crc kubenswrapper[4666]: I1203 13:03:50.040396 4666 scope.go:117] "RemoveContainer" containerID="f4fefe55b0034ba41212fb2a433c5a8321977b5d52b2f78d95dc47d4278cdf6c" Dec 03 13:03:50 crc kubenswrapper[4666]: I1203 13:03:50.329552 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j"] Dec 03 13:03:51 crc kubenswrapper[4666]: I1203 13:03:51.059686 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" event={"ID":"de91e472-2cc8-4eaf-91a3-49719f18e3f3","Type":"ContainerStarted","Data":"f991254683c81af8f31f0cbfcfbf29ed28f3c41100dabb578828beeae242eed7"} Dec 03 13:03:53 crc kubenswrapper[4666]: I1203 13:03:53.084402 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" event={"ID":"de91e472-2cc8-4eaf-91a3-49719f18e3f3","Type":"ContainerStarted","Data":"c9b1893082eaa5eb92b871a42a338adec48872523ccab79f6f55c3e07cfbe8ae"} Dec 03 13:03:53 crc kubenswrapper[4666]: I1203 13:03:53.102503 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" podStartSLOduration=2.020728914 podStartE2EDuration="4.102483041s" podCreationTimestamp="2025-12-03 13:03:49 +0000 UTC" firstStartedPulling="2025-12-03 13:03:50.3345066 +0000 UTC m=+3019.179467651" lastFinishedPulling="2025-12-03 13:03:52.416260727 +0000 UTC m=+3021.261221778" observedRunningTime="2025-12-03 13:03:53.097077925 +0000 UTC m=+3021.942038976" 
watchObservedRunningTime="2025-12-03 13:03:53.102483041 +0000 UTC m=+3021.947444102" Dec 03 13:04:05 crc kubenswrapper[4666]: I1203 13:04:05.180413 4666 generic.go:334] "Generic (PLEG): container finished" podID="de91e472-2cc8-4eaf-91a3-49719f18e3f3" containerID="c9b1893082eaa5eb92b871a42a338adec48872523ccab79f6f55c3e07cfbe8ae" exitCode=0 Dec 03 13:04:05 crc kubenswrapper[4666]: I1203 13:04:05.180617 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" event={"ID":"de91e472-2cc8-4eaf-91a3-49719f18e3f3","Type":"ContainerDied","Data":"c9b1893082eaa5eb92b871a42a338adec48872523ccab79f6f55c3e07cfbe8ae"} Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.625351 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.701800 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph\") pod \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.701882 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key\") pod \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.702050 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle\") pod \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.702159 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory\") pod \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.702253 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdw46\" (UniqueName: \"kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46\") pod \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\" (UID: \"de91e472-2cc8-4eaf-91a3-49719f18e3f3\") " Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.710143 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph" (OuterVolumeSpecName: "ceph") pod "de91e472-2cc8-4eaf-91a3-49719f18e3f3" (UID: "de91e472-2cc8-4eaf-91a3-49719f18e3f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.710171 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46" (OuterVolumeSpecName: "kube-api-access-bdw46") pod "de91e472-2cc8-4eaf-91a3-49719f18e3f3" (UID: "de91e472-2cc8-4eaf-91a3-49719f18e3f3"). InnerVolumeSpecName "kube-api-access-bdw46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.710299 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "de91e472-2cc8-4eaf-91a3-49719f18e3f3" (UID: "de91e472-2cc8-4eaf-91a3-49719f18e3f3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.738840 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory" (OuterVolumeSpecName: "inventory") pod "de91e472-2cc8-4eaf-91a3-49719f18e3f3" (UID: "de91e472-2cc8-4eaf-91a3-49719f18e3f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.756734 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de91e472-2cc8-4eaf-91a3-49719f18e3f3" (UID: "de91e472-2cc8-4eaf-91a3-49719f18e3f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.804737 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.804775 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.804791 4666 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.804804 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de91e472-2cc8-4eaf-91a3-49719f18e3f3-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:04:06 crc kubenswrapper[4666]: I1203 13:04:06.804814 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdw46\" (UniqueName: \"kubernetes.io/projected/de91e472-2cc8-4eaf-91a3-49719f18e3f3-kube-api-access-bdw46\") on node \"crc\" DevicePath \"\"" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.211185 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" event={"ID":"de91e472-2cc8-4eaf-91a3-49719f18e3f3","Type":"ContainerDied","Data":"f991254683c81af8f31f0cbfcfbf29ed28f3c41100dabb578828beeae242eed7"} Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.211271 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f991254683c81af8f31f0cbfcfbf29ed28f3c41100dabb578828beeae242eed7" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.211333 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.286055 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj"] Dec 03 13:04:07 crc kubenswrapper[4666]: E1203 13:04:07.286538 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de91e472-2cc8-4eaf-91a3-49719f18e3f3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.286567 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="de91e472-2cc8-4eaf-91a3-49719f18e3f3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.286810 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="de91e472-2cc8-4eaf-91a3-49719f18e3f3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.287557 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.291605 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.291934 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.307819 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.308062 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.308481 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.309552 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj"] Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.417070 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.417134 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.417209 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.417240 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkmm\" (UniqueName: \"kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.417335 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.519899 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.519995 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkmm\" (UniqueName: \"kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.520035 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.520178 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.520208 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.525913 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 
crc kubenswrapper[4666]: I1203 13:04:07.525913 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.526236 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.528175 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.541347 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkmm\" (UniqueName: \"kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:07 crc kubenswrapper[4666]: I1203 13:04:07.604798 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:04:08 crc kubenswrapper[4666]: I1203 13:04:08.157737 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj"] Dec 03 13:04:08 crc kubenswrapper[4666]: I1203 13:04:08.221821 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" event={"ID":"1bcf0d09-1d7c-4a79-a477-f10b1584bc42","Type":"ContainerStarted","Data":"cca241764230803b298a414ca9effb6a5e8fa09c4b1afc3dbc2af9bb2b56e665"} Dec 03 13:04:09 crc kubenswrapper[4666]: I1203 13:04:09.231276 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" event={"ID":"1bcf0d09-1d7c-4a79-a477-f10b1584bc42","Type":"ContainerStarted","Data":"87d49ea53d6cc2779c6398b7512a4e18d4151e9eed14958ed41ea44d212da5c1"} Dec 03 13:04:09 crc kubenswrapper[4666]: I1203 13:04:09.251988 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" podStartSLOduration=1.7923024490000001 podStartE2EDuration="2.251971612s" podCreationTimestamp="2025-12-03 13:04:07 +0000 UTC" firstStartedPulling="2025-12-03 13:04:08.164046816 +0000 UTC m=+3037.009007857" lastFinishedPulling="2025-12-03 13:04:08.623715959 +0000 UTC m=+3037.468677020" observedRunningTime="2025-12-03 13:04:09.24704131 +0000 UTC m=+3038.092002361" watchObservedRunningTime="2025-12-03 13:04:09.251971612 +0000 UTC m=+3038.096932663" Dec 03 13:04:39 crc kubenswrapper[4666]: I1203 13:04:39.866166 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:04:39 crc kubenswrapper[4666]: I1203 13:04:39.866776 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:04:50 crc kubenswrapper[4666]: I1203 13:04:50.190874 4666 scope.go:117] "RemoveContainer" containerID="478a52901eab6d8affb9713243a844edfa72cf06b78f712397086b665a997af0" Dec 03 13:04:50 crc kubenswrapper[4666]: I1203 13:04:50.236758 4666 scope.go:117] "RemoveContainer" containerID="08c41bee9a85eb9c9121aa899a764e4b287096b4a5e4287a44df9522a708434a" Dec 03 13:04:50 crc kubenswrapper[4666]: I1203 13:04:50.298549 4666 scope.go:117] "RemoveContainer" containerID="70ed663aefee3bedecc88814c9e13984ee7de31a1c843b79fb5c13c417638399" Dec 03 13:05:09 crc kubenswrapper[4666]: I1203 13:05:09.865792 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:05:09 crc kubenswrapper[4666]: I1203 13:05:09.866314 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:05:39 crc kubenswrapper[4666]: I1203 13:05:39.866534 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:05:39 crc kubenswrapper[4666]: I1203 13:05:39.867062 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:05:39 crc kubenswrapper[4666]: I1203 13:05:39.867145 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:05:39 crc kubenswrapper[4666]: I1203 13:05:39.867868 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:05:39 crc kubenswrapper[4666]: I1203 13:05:39.867927 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" gracePeriod=600 Dec 03 13:05:40 crc kubenswrapper[4666]: E1203 13:05:40.001901 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:05:40 crc kubenswrapper[4666]: I1203 13:05:40.037628 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" exitCode=0 Dec 03 13:05:40 crc kubenswrapper[4666]: I1203 13:05:40.037679 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd"} Dec 03 13:05:40 crc kubenswrapper[4666]: I1203 13:05:40.037720 4666 scope.go:117] "RemoveContainer" containerID="b547b766ea1db48dd959cb6becbb43e323c6d6a5ccfa1ece53ac12bb61932852" Dec 03 13:05:40 crc kubenswrapper[4666]: I1203 13:05:40.038425 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:05:40 crc kubenswrapper[4666]: E1203 13:05:40.039834 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:05:47 crc kubenswrapper[4666]: I1203 13:05:47.098753 4666 generic.go:334] "Generic (PLEG): container finished" podID="1bcf0d09-1d7c-4a79-a477-f10b1584bc42" containerID="87d49ea53d6cc2779c6398b7512a4e18d4151e9eed14958ed41ea44d212da5c1" exitCode=0 Dec 03 13:05:47 crc kubenswrapper[4666]: I1203 13:05:47.098843 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" event={"ID":"1bcf0d09-1d7c-4a79-a477-f10b1584bc42","Type":"ContainerDied","Data":"87d49ea53d6cc2779c6398b7512a4e18d4151e9eed14958ed41ea44d212da5c1"} Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.509926 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.541598 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory\") pod \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.541954 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph\") pod \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.541996 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkmm\" (UniqueName: \"kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm\") pod \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.542059 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key\") pod \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.542105 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle\") pod \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\" (UID: \"1bcf0d09-1d7c-4a79-a477-f10b1584bc42\") " Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.547521 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph" (OuterVolumeSpecName: "ceph") pod "1bcf0d09-1d7c-4a79-a477-f10b1584bc42" (UID: "1bcf0d09-1d7c-4a79-a477-f10b1584bc42"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.548054 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm" (OuterVolumeSpecName: "kube-api-access-nqkmm") pod "1bcf0d09-1d7c-4a79-a477-f10b1584bc42" (UID: "1bcf0d09-1d7c-4a79-a477-f10b1584bc42"). InnerVolumeSpecName "kube-api-access-nqkmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.555198 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1bcf0d09-1d7c-4a79-a477-f10b1584bc42" (UID: "1bcf0d09-1d7c-4a79-a477-f10b1584bc42"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.576304 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory" (OuterVolumeSpecName: "inventory") pod "1bcf0d09-1d7c-4a79-a477-f10b1584bc42" (UID: "1bcf0d09-1d7c-4a79-a477-f10b1584bc42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.578656 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1bcf0d09-1d7c-4a79-a477-f10b1584bc42" (UID: "1bcf0d09-1d7c-4a79-a477-f10b1584bc42"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.643259 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.643299 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkmm\" (UniqueName: \"kubernetes.io/projected/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-kube-api-access-nqkmm\") on node \"crc\" DevicePath \"\"" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.643313 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.643322 4666 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:05:48 crc kubenswrapper[4666]: I1203 13:05:48.643333 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bcf0d09-1d7c-4a79-a477-f10b1584bc42-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.125679 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" event={"ID":"1bcf0d09-1d7c-4a79-a477-f10b1584bc42","Type":"ContainerDied","Data":"cca241764230803b298a414ca9effb6a5e8fa09c4b1afc3dbc2af9bb2b56e665"} Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.125725 
4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca241764230803b298a414ca9effb6a5e8fa09c4b1afc3dbc2af9bb2b56e665" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.125732 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.204061 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs"] Dec 03 13:05:49 crc kubenswrapper[4666]: E1203 13:05:49.204457 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcf0d09-1d7c-4a79-a477-f10b1584bc42" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.204474 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcf0d09-1d7c-4a79-a477-f10b1584bc42" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.204644 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcf0d09-1d7c-4a79-a477-f10b1584bc42" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.205266 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.207237 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.207527 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.208078 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.209191 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.218585 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.218700 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs"] Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.252398 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.252445 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.252470 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.252499 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.353781 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.353958 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.354001 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.354038 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.357884 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.359505 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.359808 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.375195 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:49 crc kubenswrapper[4666]: I1203 13:05:49.524101 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:05:50 crc kubenswrapper[4666]: I1203 13:05:50.157851 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs"] Dec 03 13:05:50 crc kubenswrapper[4666]: I1203 13:05:50.165039 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:05:51 crc kubenswrapper[4666]: I1203 13:05:51.142365 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" event={"ID":"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4","Type":"ContainerStarted","Data":"9e4a61c4c651bbee81f076df76a8eadcc6e4e19eccf53446329c9ff1fc11f294"} Dec 03 13:05:51 crc kubenswrapper[4666]: I1203 13:05:51.142761 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" event={"ID":"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4","Type":"ContainerStarted","Data":"48779cf2abc4cfe6c240d371c1d7815fea48b22c216c9e871da34a2c8c26cc64"} Dec 03 13:05:51 crc kubenswrapper[4666]: I1203 13:05:51.169235 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" podStartSLOduration=1.733504025 podStartE2EDuration="2.169212573s" podCreationTimestamp="2025-12-03 13:05:49 +0000 UTC" firstStartedPulling="2025-12-03 13:05:50.164779619 +0000 UTC m=+3139.009740670" lastFinishedPulling="2025-12-03 13:05:50.600488167 +0000 UTC m=+3139.445449218" observedRunningTime="2025-12-03 13:05:51.163189831 +0000 UTC m=+3140.008150892" watchObservedRunningTime="2025-12-03 13:05:51.169212573 +0000 UTC m=+3140.014173624" Dec 03 13:05:54 crc kubenswrapper[4666]: I1203 13:05:54.423706 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:05:54 crc kubenswrapper[4666]: E1203 13:05:54.424491 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:06:08 crc kubenswrapper[4666]: I1203 13:06:08.424042 4666 scope.go:117] "RemoveContainer" 
containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:06:08 crc kubenswrapper[4666]: E1203 13:06:08.424853 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:06:16 crc kubenswrapper[4666]: I1203 13:06:16.365218 4666 generic.go:334] "Generic (PLEG): container finished" podID="55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" containerID="9e4a61c4c651bbee81f076df76a8eadcc6e4e19eccf53446329c9ff1fc11f294" exitCode=0 Dec 03 13:06:16 crc kubenswrapper[4666]: I1203 13:06:16.365340 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" event={"ID":"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4","Type":"ContainerDied","Data":"9e4a61c4c651bbee81f076df76a8eadcc6e4e19eccf53446329c9ff1fc11f294"} Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.770985 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.787642 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory\") pod \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.787859 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key\") pod \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.787943 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2\") pod \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.787997 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph\") pod \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\" (UID: \"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4\") " Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.796284 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2" (OuterVolumeSpecName: "kube-api-access-lqzd2") pod "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" (UID: "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4"). InnerVolumeSpecName "kube-api-access-lqzd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.797314 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph" (OuterVolumeSpecName: "ceph") pod "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" (UID: "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.818930 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" (UID: "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.819275 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory" (OuterVolumeSpecName: "inventory") pod "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" (UID: "55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.893984 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.894016 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.894026 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-kube-api-access-lqzd2\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:17 crc kubenswrapper[4666]: I1203 13:06:17.894037 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.380899 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" event={"ID":"55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4","Type":"ContainerDied","Data":"48779cf2abc4cfe6c240d371c1d7815fea48b22c216c9e871da34a2c8c26cc64"} Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.380942 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48779cf2abc4cfe6c240d371c1d7815fea48b22c216c9e871da34a2c8c26cc64" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.380986 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.479281 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh"] Dec 03 13:06:18 crc kubenswrapper[4666]: E1203 13:06:18.480055 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.480074 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.480311 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.481002 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.481815 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh"] Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.483203 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.483652 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.484194 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.484474 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.486429 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.605695 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlln\" (UniqueName: \"kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.605792 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.605840 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" 
(UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.605989 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.708234 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.708378 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlln\" (UniqueName: \"kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.708694 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.708779 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.713551 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.713551 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.723022 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.729136 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlln\" (UniqueName: \"kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9drzh\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:18 crc kubenswrapper[4666]: I1203 13:06:18.802401 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:19 crc kubenswrapper[4666]: I1203 13:06:19.320055 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh"] Dec 03 13:06:19 crc kubenswrapper[4666]: I1203 13:06:19.390626 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" event={"ID":"acf58997-af21-4832-a74c-f81057c84d08","Type":"ContainerStarted","Data":"d2ef0afd1e5f78d3706dba9e350fe8402e8766e6b8bc5babdb86b17e7a4576ff"} Dec 03 13:06:22 crc kubenswrapper[4666]: I1203 13:06:22.445826 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" event={"ID":"acf58997-af21-4832-a74c-f81057c84d08","Type":"ContainerStarted","Data":"d49eb939f4337d9d91ae622983f32edb243815968dbf4d407e4bcd036b18e76e"} Dec 03 13:06:22 crc kubenswrapper[4666]: I1203 13:06:22.465866 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" podStartSLOduration=2.483934784 podStartE2EDuration="4.465848235s" podCreationTimestamp="2025-12-03 13:06:18 +0000 UTC" firstStartedPulling="2025-12-03 13:06:19.325647507 +0000 UTC m=+3168.170608558" lastFinishedPulling="2025-12-03 13:06:21.307560958 +0000 UTC m=+3170.152522009" observedRunningTime="2025-12-03 13:06:22.460913732 +0000 UTC m=+3171.305874793" watchObservedRunningTime="2025-12-03 13:06:22.465848235 +0000 UTC m=+3171.310809296" Dec 03 13:06:23 crc kubenswrapper[4666]: I1203 13:06:23.424071 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:06:23 crc kubenswrapper[4666]: E1203 13:06:23.424780 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:06:26 crc kubenswrapper[4666]: I1203 13:06:26.479605 4666 generic.go:334] "Generic (PLEG): container finished" podID="acf58997-af21-4832-a74c-f81057c84d08" containerID="d49eb939f4337d9d91ae622983f32edb243815968dbf4d407e4bcd036b18e76e" exitCode=0 Dec 03 13:06:26 crc kubenswrapper[4666]: I1203 13:06:26.479699 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" event={"ID":"acf58997-af21-4832-a74c-f81057c84d08","Type":"ContainerDied","Data":"d49eb939f4337d9d91ae622983f32edb243815968dbf4d407e4bcd036b18e76e"} 
Dec 03 13:06:27 crc kubenswrapper[4666]: I1203 13:06:27.865951 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.032358 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlln\" (UniqueName: \"kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln\") pod \"acf58997-af21-4832-a74c-f81057c84d08\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.035202 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory\") pod \"acf58997-af21-4832-a74c-f81057c84d08\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.035452 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key\") pod \"acf58997-af21-4832-a74c-f81057c84d08\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.035576 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph\") pod \"acf58997-af21-4832-a74c-f81057c84d08\" (UID: \"acf58997-af21-4832-a74c-f81057c84d08\") " Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.041386 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph" (OuterVolumeSpecName: "ceph") pod "acf58997-af21-4832-a74c-f81057c84d08" (UID: "acf58997-af21-4832-a74c-f81057c84d08"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.041384 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln" (OuterVolumeSpecName: "kube-api-access-crlln") pod "acf58997-af21-4832-a74c-f81057c84d08" (UID: "acf58997-af21-4832-a74c-f81057c84d08"). InnerVolumeSpecName "kube-api-access-crlln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.060898 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory" (OuterVolumeSpecName: "inventory") pod "acf58997-af21-4832-a74c-f81057c84d08" (UID: "acf58997-af21-4832-a74c-f81057c84d08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.066716 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acf58997-af21-4832-a74c-f81057c84d08" (UID: "acf58997-af21-4832-a74c-f81057c84d08"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.138223 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlln\" (UniqueName: \"kubernetes.io/projected/acf58997-af21-4832-a74c-f81057c84d08-kube-api-access-crlln\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.138272 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.138283 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.138300 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acf58997-af21-4832-a74c-f81057c84d08-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.499317 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" event={"ID":"acf58997-af21-4832-a74c-f81057c84d08","Type":"ContainerDied","Data":"d2ef0afd1e5f78d3706dba9e350fe8402e8766e6b8bc5babdb86b17e7a4576ff"} Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.499360 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ef0afd1e5f78d3706dba9e350fe8402e8766e6b8bc5babdb86b17e7a4576ff" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.499397 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9drzh" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.570800 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2"] Dec 03 13:06:28 crc kubenswrapper[4666]: E1203 13:06:28.571341 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf58997-af21-4832-a74c-f81057c84d08" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.571368 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf58997-af21-4832-a74c-f81057c84d08" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.571581 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf58997-af21-4832-a74c-f81057c84d08" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.572969 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.576607 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.576696 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.576734 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.576754 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.576845 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.587169 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2"] Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.646309 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.646438 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.646518 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.646560 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cshx4\" (UniqueName: \"kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.747613 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.747702 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.747741 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cshx4\" (UniqueName: \"kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.747772 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.751442 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.751993 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.752498 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.763821 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cshx4\" (UniqueName: \"kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r2rn2\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:28 crc kubenswrapper[4666]: I1203 13:06:28.888183 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:06:29 crc kubenswrapper[4666]: I1203 13:06:29.392625 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2"] Dec 03 13:06:29 crc kubenswrapper[4666]: I1203 13:06:29.520596 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" event={"ID":"19109296-aca4-46fa-95d5-70dcd8604ab7","Type":"ContainerStarted","Data":"43de6346d3a609b2b9b1475d610b94fa7d5f2e078c09d09bb679a120853033a0"} Dec 03 13:06:30 crc kubenswrapper[4666]: I1203 13:06:30.530528 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" event={"ID":"19109296-aca4-46fa-95d5-70dcd8604ab7","Type":"ContainerStarted","Data":"e9a226a99862af88fdd2e030957528e7d6a9eb29ee3896753aa07696d23af448"} Dec 03 13:06:30 crc kubenswrapper[4666]: I1203 13:06:30.548893 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" podStartSLOduration=2.106057062 podStartE2EDuration="2.548871962s" podCreationTimestamp="2025-12-03 13:06:28 +0000 UTC" firstStartedPulling="2025-12-03 13:06:29.399876957 +0000 UTC m=+3178.244838008" lastFinishedPulling="2025-12-03 13:06:29.842691857 +0000 UTC m=+3178.687652908" observedRunningTime="2025-12-03 13:06:30.544280818 +0000 UTC m=+3179.389241879" watchObservedRunningTime="2025-12-03 13:06:30.548871962 +0000 UTC m=+3179.393833013" Dec 03 13:06:34 crc kubenswrapper[4666]: I1203 13:06:34.424585 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:06:34 crc kubenswrapper[4666]: E1203 13:06:34.425155 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:06:45 crc kubenswrapper[4666]: I1203 13:06:45.424274 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:06:45 crc kubenswrapper[4666]: E1203 13:06:45.425443 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:00 crc kubenswrapper[4666]: I1203 13:07:00.423125 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:07:00 crc kubenswrapper[4666]: E1203 13:07:00.424057 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:04 crc kubenswrapper[4666]: I1203 13:07:04.815839 4666 generic.go:334] "Generic (PLEG): container finished" podID="19109296-aca4-46fa-95d5-70dcd8604ab7" containerID="e9a226a99862af88fdd2e030957528e7d6a9eb29ee3896753aa07696d23af448" exitCode=0 Dec 03 13:07:04 crc kubenswrapper[4666]: I1203 13:07:04.815926 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" event={"ID":"19109296-aca4-46fa-95d5-70dcd8604ab7","Type":"ContainerDied","Data":"e9a226a99862af88fdd2e030957528e7d6a9eb29ee3896753aa07696d23af448"} Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.251060 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.338485 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph\") pod \"19109296-aca4-46fa-95d5-70dcd8604ab7\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.338689 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key\") pod \"19109296-aca4-46fa-95d5-70dcd8604ab7\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.338738 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory\") pod \"19109296-aca4-46fa-95d5-70dcd8604ab7\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.339074 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cshx4\" (UniqueName: \"kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4\") pod \"19109296-aca4-46fa-95d5-70dcd8604ab7\" (UID: \"19109296-aca4-46fa-95d5-70dcd8604ab7\") " Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.350723 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph" (OuterVolumeSpecName: "ceph") pod "19109296-aca4-46fa-95d5-70dcd8604ab7" (UID: "19109296-aca4-46fa-95d5-70dcd8604ab7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.350741 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4" (OuterVolumeSpecName: "kube-api-access-cshx4") pod "19109296-aca4-46fa-95d5-70dcd8604ab7" (UID: "19109296-aca4-46fa-95d5-70dcd8604ab7"). InnerVolumeSpecName "kube-api-access-cshx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.381071 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19109296-aca4-46fa-95d5-70dcd8604ab7" (UID: "19109296-aca4-46fa-95d5-70dcd8604ab7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.387422 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory" (OuterVolumeSpecName: "inventory") pod "19109296-aca4-46fa-95d5-70dcd8604ab7" (UID: "19109296-aca4-46fa-95d5-70dcd8604ab7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.441937 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cshx4\" (UniqueName: \"kubernetes.io/projected/19109296-aca4-46fa-95d5-70dcd8604ab7-kube-api-access-cshx4\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.441983 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.441994 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.442006 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19109296-aca4-46fa-95d5-70dcd8604ab7-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.837601 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" event={"ID":"19109296-aca4-46fa-95d5-70dcd8604ab7","Type":"ContainerDied","Data":"43de6346d3a609b2b9b1475d610b94fa7d5f2e078c09d09bb679a120853033a0"} Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.837912 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43de6346d3a609b2b9b1475d610b94fa7d5f2e078c09d09bb679a120853033a0" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.837685 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r2rn2" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.909542 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf"] Dec 03 13:07:06 crc kubenswrapper[4666]: E1203 13:07:06.909983 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19109296-aca4-46fa-95d5-70dcd8604ab7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.910005 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="19109296-aca4-46fa-95d5-70dcd8604ab7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.910251 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="19109296-aca4-46fa-95d5-70dcd8604ab7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.910953 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.915909 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.916275 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.916440 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.916608 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.916763 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:07:06 crc kubenswrapper[4666]: I1203 13:07:06.940823 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf"] Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.056041 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.056234 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.056293 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gfv\" (UniqueName: \"kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.056381 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.157758 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.158123 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.158246 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97gfv\" (UniqueName: \"kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.158459 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.162817 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.162980 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.163218 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.180725 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97gfv\" (UniqueName: \"kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.235510 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.811445 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf"] Dec 03 13:07:07 crc kubenswrapper[4666]: I1203 13:07:07.847639 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" event={"ID":"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6","Type":"ContainerStarted","Data":"76ab3d1be722cdf36988a91ec117fa9d1b3d9ffea11eb22ddacf67d429c0d0e2"} Dec 03 13:07:08 crc kubenswrapper[4666]: I1203 13:07:08.861504 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" event={"ID":"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6","Type":"ContainerStarted","Data":"55d5bb69822f0d676c073726a27e51fd7ec1ff483f73e7f87de16ad03993cece"} Dec 03 13:07:08 crc kubenswrapper[4666]: I1203 13:07:08.893061 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" podStartSLOduration=2.159714332 podStartE2EDuration="2.893042941s" podCreationTimestamp="2025-12-03 13:07:06 +0000 UTC" firstStartedPulling="2025-12-03 13:07:07.819154662 +0000 UTC m=+3216.664115713" lastFinishedPulling="2025-12-03 13:07:08.552483271 +0000 UTC m=+3217.397444322" observedRunningTime="2025-12-03 13:07:08.886633078 +0000 UTC m=+3217.731594129" watchObservedRunningTime="2025-12-03 13:07:08.893042941 +0000 UTC m=+3217.738003992" Dec 03 13:07:12 crc kubenswrapper[4666]: I1203 13:07:12.891618 4666 generic.go:334] "Generic (PLEG): container finished" podID="01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" containerID="55d5bb69822f0d676c073726a27e51fd7ec1ff483f73e7f87de16ad03993cece" exitCode=0 Dec 03 13:07:12 crc kubenswrapper[4666]: I1203 13:07:12.891705 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" event={"ID":"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6","Type":"ContainerDied","Data":"55d5bb69822f0d676c073726a27e51fd7ec1ff483f73e7f87de16ad03993cece"} Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.329212 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.424478 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:07:14 crc kubenswrapper[4666]: E1203 13:07:14.425150 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.523220 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph\") pod \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.524367 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory\") pod \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.524560 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97gfv\" (UniqueName: \"kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv\") pod \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.524603 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key\") pod \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\" (UID: \"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6\") " Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.530209 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph" (OuterVolumeSpecName: "ceph") pod "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" (UID: "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.530281 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv" (OuterVolumeSpecName: "kube-api-access-97gfv") pod "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" (UID: "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6"). InnerVolumeSpecName "kube-api-access-97gfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.552311 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory" (OuterVolumeSpecName: "inventory") pod "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" (UID: "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.566556 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" (UID: "01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.626175 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.626210 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.626223 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.626233 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97gfv\" (UniqueName: \"kubernetes.io/projected/01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6-kube-api-access-97gfv\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.913677 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" event={"ID":"01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6","Type":"ContainerDied","Data":"76ab3d1be722cdf36988a91ec117fa9d1b3d9ffea11eb22ddacf67d429c0d0e2"} Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.913720 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ab3d1be722cdf36988a91ec117fa9d1b3d9ffea11eb22ddacf67d429c0d0e2" Dec 03 13:07:14 crc kubenswrapper[4666]: I1203 13:07:14.913767 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.022526 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx"] Dec 03 13:07:15 crc kubenswrapper[4666]: E1203 13:07:15.022898 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.022916 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.023127 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.023681 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.026483 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.026538 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.029425 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.029883 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.035372 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.036152 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx"] Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.134256 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.134328 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.134622 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw9qp\" (UniqueName: \"kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.134727 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.236419 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.236569 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.236613 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.236657 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw9qp\" (UniqueName: \"kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.242063 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.242664 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.242799 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.260016 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw9qp\" (UniqueName: \"kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.340258 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.659685 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx"] Dec 03 13:07:15 crc kubenswrapper[4666]: I1203 13:07:15.925678 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" event={"ID":"49d2d5f0-5c89-4847-856e-cf9ed17510ec","Type":"ContainerStarted","Data":"dd7728a40278fec77c6dd34fff4f879bf3a5dd1e6ab4d28d9fc970c5fe53c51a"} Dec 03 13:07:16 crc kubenswrapper[4666]: I1203 13:07:16.948453 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" event={"ID":"49d2d5f0-5c89-4847-856e-cf9ed17510ec","Type":"ContainerStarted","Data":"0de0e7fb8ba23e53376380b50a6d062d25ab34a35005ae18e4b5ffc66f243152"} Dec 03 13:07:16 crc kubenswrapper[4666]: I1203 13:07:16.974573 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" podStartSLOduration=1.539797207 podStartE2EDuration="1.974553908s" podCreationTimestamp="2025-12-03 13:07:15 +0000 UTC" firstStartedPulling="2025-12-03 13:07:15.660443197 +0000 UTC m=+3224.505404248" lastFinishedPulling="2025-12-03 13:07:16.095199898 +0000 UTC m=+3224.940160949" observedRunningTime="2025-12-03 13:07:16.968945286 +0000 UTC m=+3225.813906377" watchObservedRunningTime="2025-12-03 13:07:16.974553908 +0000 UTC m=+3225.819514959" Dec 03 13:07:26 crc kubenswrapper[4666]: I1203 13:07:26.423309 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:07:26 crc kubenswrapper[4666]: E1203 13:07:26.424177 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:38 crc kubenswrapper[4666]: I1203 13:07:38.424499 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:07:38 crc kubenswrapper[4666]: E1203 13:07:38.425566 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:50 crc kubenswrapper[4666]: I1203 13:07:50.422964 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:07:50 crc kubenswrapper[4666]: E1203 13:07:50.424999 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:07:58 crc kubenswrapper[4666]: I1203 13:07:58.295197 4666 generic.go:334] "Generic (PLEG): container finished" podID="49d2d5f0-5c89-4847-856e-cf9ed17510ec" containerID="0de0e7fb8ba23e53376380b50a6d062d25ab34a35005ae18e4b5ffc66f243152" exitCode=0 Dec 03 13:07:58 crc kubenswrapper[4666]: I1203 13:07:58.295319 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" event={"ID":"49d2d5f0-5c89-4847-856e-cf9ed17510ec","Type":"ContainerDied","Data":"0de0e7fb8ba23e53376380b50a6d062d25ab34a35005ae18e4b5ffc66f243152"} Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.670887 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.757043 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory\") pod \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.757263 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key\") pod \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.757317 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph\") pod \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.757410 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw9qp\" (UniqueName: \"kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp\") pod \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\" (UID: \"49d2d5f0-5c89-4847-856e-cf9ed17510ec\") " Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.762136 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph" (OuterVolumeSpecName: "ceph") pod "49d2d5f0-5c89-4847-856e-cf9ed17510ec" (UID: "49d2d5f0-5c89-4847-856e-cf9ed17510ec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.762313 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp" (OuterVolumeSpecName: "kube-api-access-dw9qp") pod "49d2d5f0-5c89-4847-856e-cf9ed17510ec" (UID: "49d2d5f0-5c89-4847-856e-cf9ed17510ec"). InnerVolumeSpecName "kube-api-access-dw9qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.794708 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49d2d5f0-5c89-4847-856e-cf9ed17510ec" (UID: "49d2d5f0-5c89-4847-856e-cf9ed17510ec"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.800301 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory" (OuterVolumeSpecName: "inventory") pod "49d2d5f0-5c89-4847-856e-cf9ed17510ec" (UID: "49d2d5f0-5c89-4847-856e-cf9ed17510ec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.859690 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.859724 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.859733 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49d2d5f0-5c89-4847-856e-cf9ed17510ec-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:07:59 crc kubenswrapper[4666]: I1203 13:07:59.859741 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw9qp\" (UniqueName: \"kubernetes.io/projected/49d2d5f0-5c89-4847-856e-cf9ed17510ec-kube-api-access-dw9qp\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.311874 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" event={"ID":"49d2d5f0-5c89-4847-856e-cf9ed17510ec","Type":"ContainerDied","Data":"dd7728a40278fec77c6dd34fff4f879bf3a5dd1e6ab4d28d9fc970c5fe53c51a"} Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.311921 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.311922 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7728a40278fec77c6dd34fff4f879bf3a5dd1e6ab4d28d9fc970c5fe53c51a" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.415976 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-npxb5"] Dec 03 13:08:00 crc kubenswrapper[4666]: E1203 13:08:00.416427 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d2d5f0-5c89-4847-856e-cf9ed17510ec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.416446 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d2d5f0-5c89-4847-856e-cf9ed17510ec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.416732 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d2d5f0-5c89-4847-856e-cf9ed17510ec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.417520 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.420905 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.420935 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.421136 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.421227 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.421914 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.428499 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-npxb5"] Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.471686 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk2r\" (UniqueName: \"kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.471849 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.471899 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.471930 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.573979 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xk2r\" (UniqueName: \"kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.574296 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.574350 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.574390 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.578154 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.578285 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.579082 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.593364 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xk2r\" (UniqueName: \"kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r\") pod \"ssh-known-hosts-edpm-deployment-npxb5\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:00 crc kubenswrapper[4666]: I1203 13:08:00.734611 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:01 crc kubenswrapper[4666]: I1203 13:08:01.231679 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-npxb5"] Dec 03 13:08:01 crc kubenswrapper[4666]: I1203 13:08:01.321164 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" event={"ID":"15e79024-d1c5-4689-900b-92ded975568d","Type":"ContainerStarted","Data":"8db62c4bb2e420cbf89f3afc6f444ef0ec3b02bac53771c6c8e370861a72102f"} Dec 03 13:08:03 crc kubenswrapper[4666]: I1203 13:08:03.339445 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" event={"ID":"15e79024-d1c5-4689-900b-92ded975568d","Type":"ContainerStarted","Data":"bb6986c750add1210634f77b04e91d210f76efb21303b699fb4d4a6b0de5d810"} Dec 03 13:08:03 crc kubenswrapper[4666]: I1203 13:08:03.357614 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" podStartSLOduration=2.637558752 podStartE2EDuration="3.357597822s" podCreationTimestamp="2025-12-03 13:08:00 +0000 UTC" firstStartedPulling="2025-12-03 13:08:01.248177239 +0000 UTC m=+3270.093138290" lastFinishedPulling="2025-12-03 13:08:01.968216309 +0000 UTC m=+3270.813177360" observedRunningTime="2025-12-03 13:08:03.356010059 +0000 UTC m=+3272.200971110" watchObservedRunningTime="2025-12-03 13:08:03.357597822 +0000 UTC m=+3272.202558883" Dec 03 13:08:05 crc kubenswrapper[4666]: I1203 13:08:05.424208 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:08:05 crc kubenswrapper[4666]: E1203 13:08:05.425417 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:08:11 crc kubenswrapper[4666]: I1203 13:08:11.408557 4666 generic.go:334] "Generic (PLEG): container finished" podID="15e79024-d1c5-4689-900b-92ded975568d" containerID="bb6986c750add1210634f77b04e91d210f76efb21303b699fb4d4a6b0de5d810" exitCode=0 Dec 03 13:08:11 crc kubenswrapper[4666]: I1203 13:08:11.408644 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" event={"ID":"15e79024-d1c5-4689-900b-92ded975568d","Type":"ContainerDied","Data":"bb6986c750add1210634f77b04e91d210f76efb21303b699fb4d4a6b0de5d810"} Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.823956 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.900297 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xk2r\" (UniqueName: \"kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r\") pod \"15e79024-d1c5-4689-900b-92ded975568d\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.900741 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam\") pod \"15e79024-d1c5-4689-900b-92ded975568d\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.900900 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph\") pod \"15e79024-d1c5-4689-900b-92ded975568d\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.900950 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0\") pod \"15e79024-d1c5-4689-900b-92ded975568d\" (UID: \"15e79024-d1c5-4689-900b-92ded975568d\") " Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.906872 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph" (OuterVolumeSpecName: "ceph") pod "15e79024-d1c5-4689-900b-92ded975568d" (UID: "15e79024-d1c5-4689-900b-92ded975568d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.906911 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r" (OuterVolumeSpecName: "kube-api-access-9xk2r") pod "15e79024-d1c5-4689-900b-92ded975568d" (UID: "15e79024-d1c5-4689-900b-92ded975568d"). InnerVolumeSpecName "kube-api-access-9xk2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.926565 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "15e79024-d1c5-4689-900b-92ded975568d" (UID: "15e79024-d1c5-4689-900b-92ded975568d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:12 crc kubenswrapper[4666]: I1203 13:08:12.927919 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "15e79024-d1c5-4689-900b-92ded975568d" (UID: "15e79024-d1c5-4689-900b-92ded975568d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.004622 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xk2r\" (UniqueName: \"kubernetes.io/projected/15e79024-d1c5-4689-900b-92ded975568d-kube-api-access-9xk2r\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.004670 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.004684 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.004697 4666 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15e79024-d1c5-4689-900b-92ded975568d-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.428504 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.443343 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-npxb5" event={"ID":"15e79024-d1c5-4689-900b-92ded975568d","Type":"ContainerDied","Data":"8db62c4bb2e420cbf89f3afc6f444ef0ec3b02bac53771c6c8e370861a72102f"} Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.443437 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8db62c4bb2e420cbf89f3afc6f444ef0ec3b02bac53771c6c8e370861a72102f" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.507386 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs"] Dec 03 13:08:13 crc kubenswrapper[4666]: E1203 13:08:13.507892 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e79024-d1c5-4689-900b-92ded975568d" containerName="ssh-known-hosts-edpm-deployment" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.507918 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e79024-d1c5-4689-900b-92ded975568d" containerName="ssh-known-hosts-edpm-deployment" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.508261 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e79024-d1c5-4689-900b-92ded975568d" containerName="ssh-known-hosts-edpm-deployment" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.509019 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.514386 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.514580 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.514592 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.514710 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.516145 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.527426 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs"] Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.615608 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.615707 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.615868 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.616747 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8v5v\" (UniqueName: \"kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.720216 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v5v\" (UniqueName: \"kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.720317 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.720420 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.720535 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.725132 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.726382 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.726578 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.740036 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v5v\" (UniqueName: \"kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flnhs\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.826820 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.988496 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.990526 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:13 crc kubenswrapper[4666]: I1203 13:08:13.997760 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.180919 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77xx\" (UniqueName: \"kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.181320 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.181382 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.283293 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c77xx\" (UniqueName: \"kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.283486 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.283518 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.284332 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.284371 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.315421 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c77xx\" (UniqueName: \"kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx\") pod \"redhat-marketplace-s87hn\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.438413 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs"] Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.443297 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" event={"ID":"434bedfb-c0c1-45d4-ade6-8e5112122e58","Type":"ContainerStarted","Data":"f76c97d1ed6e8af112eb687886b7eb663afc494d617c1bd5aa6a017cd247bfed"} Dec 03 13:08:14 crc kubenswrapper[4666]: I1203 13:08:14.616424 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:15 crc kubenswrapper[4666]: W1203 13:08:15.091778 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c11a4a_0034_4480_882e_4bb7d31dc5a9.slice/crio-18e119fa8a1a57e24d70e7e405d8b95e51285eb03710fa1be8e38d7ac60bb28b WatchSource:0}: Error finding container 18e119fa8a1a57e24d70e7e405d8b95e51285eb03710fa1be8e38d7ac60bb28b: Status 404 returned error can't find the container with id 18e119fa8a1a57e24d70e7e405d8b95e51285eb03710fa1be8e38d7ac60bb28b Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.094835 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.452702 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" event={"ID":"434bedfb-c0c1-45d4-ade6-8e5112122e58","Type":"ContainerStarted","Data":"200be7a50f966aed22c8032940e02cf18f1e9bd11ce1b6a4b924cca67cc71485"} Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.455662 4666 generic.go:334] "Generic (PLEG): container finished" podID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerID="5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd" exitCode=0 Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.455712 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerDied","Data":"5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd"} Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.455742 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerStarted","Data":"18e119fa8a1a57e24d70e7e405d8b95e51285eb03710fa1be8e38d7ac60bb28b"} Dec 03 13:08:15 crc kubenswrapper[4666]: I1203 13:08:15.475343 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" podStartSLOduration=1.9426504310000001 podStartE2EDuration="2.475321826s" podCreationTimestamp="2025-12-03 13:08:13 +0000 UTC" firstStartedPulling="2025-12-03 13:08:14.435367742 +0000 UTC m=+3283.280328793" lastFinishedPulling="2025-12-03 13:08:14.968039137 +0000 UTC m=+3283.813000188" observedRunningTime="2025-12-03 13:08:15.4673202 +0000 UTC m=+3284.312281271" watchObservedRunningTime="2025-12-03 
13:08:15.475321826 +0000 UTC m=+3284.320282877" Dec 03 13:08:16 crc kubenswrapper[4666]: I1203 13:08:16.468379 4666 generic.go:334] "Generic (PLEG): container finished" podID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerID="0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3" exitCode=0 Dec 03 13:08:16 crc kubenswrapper[4666]: I1203 13:08:16.468622 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerDied","Data":"0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3"} Dec 03 13:08:17 crc kubenswrapper[4666]: I1203 13:08:17.479954 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerStarted","Data":"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83"} Dec 03 13:08:17 crc kubenswrapper[4666]: I1203 13:08:17.504957 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s87hn" podStartSLOduration=3.100312032 podStartE2EDuration="4.504936565s" podCreationTimestamp="2025-12-03 13:08:13 +0000 UTC" firstStartedPulling="2025-12-03 13:08:15.45731654 +0000 UTC m=+3284.302277591" lastFinishedPulling="2025-12-03 13:08:16.861941073 +0000 UTC m=+3285.706902124" observedRunningTime="2025-12-03 13:08:17.500338121 +0000 UTC m=+3286.345299192" watchObservedRunningTime="2025-12-03 13:08:17.504936565 +0000 UTC m=+3286.349897616" Dec 03 13:08:20 crc kubenswrapper[4666]: I1203 13:08:20.423505 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:08:20 crc kubenswrapper[4666]: E1203 13:08:20.424162 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:23.532124 4666 generic.go:334] "Generic (PLEG): container finished" podID="434bedfb-c0c1-45d4-ade6-8e5112122e58" containerID="200be7a50f966aed22c8032940e02cf18f1e9bd11ce1b6a4b924cca67cc71485" exitCode=0 Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:23.532691 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" event={"ID":"434bedfb-c0c1-45d4-ade6-8e5112122e58","Type":"ContainerDied","Data":"200be7a50f966aed22c8032940e02cf18f1e9bd11ce1b6a4b924cca67cc71485"} Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:24.617523 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:24.617575 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:24.664823 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:24 crc kubenswrapper[4666]: I1203 13:08:24.934373 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.085345 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory\") pod \"434bedfb-c0c1-45d4-ade6-8e5112122e58\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.085465 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8v5v\" (UniqueName: \"kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v\") pod \"434bedfb-c0c1-45d4-ade6-8e5112122e58\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.085677 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph\") pod \"434bedfb-c0c1-45d4-ade6-8e5112122e58\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.085868 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key\") pod \"434bedfb-c0c1-45d4-ade6-8e5112122e58\" (UID: \"434bedfb-c0c1-45d4-ade6-8e5112122e58\") " Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.093693 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v" (OuterVolumeSpecName: "kube-api-access-x8v5v") pod "434bedfb-c0c1-45d4-ade6-8e5112122e58" (UID: "434bedfb-c0c1-45d4-ade6-8e5112122e58"). InnerVolumeSpecName "kube-api-access-x8v5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.094256 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph" (OuterVolumeSpecName: "ceph") pod "434bedfb-c0c1-45d4-ade6-8e5112122e58" (UID: "434bedfb-c0c1-45d4-ade6-8e5112122e58"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.116788 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "434bedfb-c0c1-45d4-ade6-8e5112122e58" (UID: "434bedfb-c0c1-45d4-ade6-8e5112122e58"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.157331 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory" (OuterVolumeSpecName: "inventory") pod "434bedfb-c0c1-45d4-ade6-8e5112122e58" (UID: "434bedfb-c0c1-45d4-ade6-8e5112122e58"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.190602 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8v5v\" (UniqueName: \"kubernetes.io/projected/434bedfb-c0c1-45d4-ade6-8e5112122e58-kube-api-access-x8v5v\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.190654 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.190667 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.190678 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/434bedfb-c0c1-45d4-ade6-8e5112122e58-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.554606 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" event={"ID":"434bedfb-c0c1-45d4-ade6-8e5112122e58","Type":"ContainerDied","Data":"f76c97d1ed6e8af112eb687886b7eb663afc494d617c1bd5aa6a017cd247bfed"} Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.554740 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f76c97d1ed6e8af112eb687886b7eb663afc494d617c1bd5aa6a017cd247bfed" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.555168 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flnhs" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.629061 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.642202 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd"] Dec 03 13:08:25 crc kubenswrapper[4666]: E1203 13:08:25.642699 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434bedfb-c0c1-45d4-ade6-8e5112122e58" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.642721 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="434bedfb-c0c1-45d4-ade6-8e5112122e58" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.642956 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="434bedfb-c0c1-45d4-ade6-8e5112122e58" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.643776 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.646942 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.647397 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.647622 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.647786 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.647914 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.665256 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd"] Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.692884 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.803563 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.803627 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.803978 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.804136 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrpv\" (UniqueName: \"kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.906220 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc 
kubenswrapper[4666]: I1203 13:08:25.906676 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrpv\" (UniqueName: \"kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.906779 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.906839 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.911699 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.912192 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.912799 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.924931 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrpv\" (UniqueName: \"kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:25 crc kubenswrapper[4666]: I1203 13:08:25.966521 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:26 crc kubenswrapper[4666]: I1203 13:08:26.474523 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd"] Dec 03 13:08:26 crc kubenswrapper[4666]: I1203 13:08:26.565586 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" event={"ID":"a854a0c8-aad2-4681-9077-c8abd034fa73","Type":"ContainerStarted","Data":"618ec5d31aa2c7ccaa19167686be2441dcf80716d5f30e0b91e89f4795c726f5"} Dec 03 13:08:27 crc kubenswrapper[4666]: I1203 13:08:27.574979 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s87hn" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="registry-server" containerID="cri-o://a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83" gracePeriod=2 Dec 03 13:08:27 crc kubenswrapper[4666]: I1203 13:08:27.979666 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.148020 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content\") pod \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.148232 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities\") pod \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.148341 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c77xx\" (UniqueName: \"kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx\") pod \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\" (UID: \"a1c11a4a-0034-4480-882e-4bb7d31dc5a9\") " Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.148921 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities" (OuterVolumeSpecName: "utilities") pod "a1c11a4a-0034-4480-882e-4bb7d31dc5a9" (UID: "a1c11a4a-0034-4480-882e-4bb7d31dc5a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.156335 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx" (OuterVolumeSpecName: "kube-api-access-c77xx") pod "a1c11a4a-0034-4480-882e-4bb7d31dc5a9" (UID: "a1c11a4a-0034-4480-882e-4bb7d31dc5a9"). InnerVolumeSpecName "kube-api-access-c77xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.171138 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1c11a4a-0034-4480-882e-4bb7d31dc5a9" (UID: "a1c11a4a-0034-4480-882e-4bb7d31dc5a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.250485 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c77xx\" (UniqueName: \"kubernetes.io/projected/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-kube-api-access-c77xx\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.250518 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.250531 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c11a4a-0034-4480-882e-4bb7d31dc5a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.586382 4666 generic.go:334] "Generic (PLEG): container finished" podID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerID="a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83" exitCode=0 Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.586462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerDied","Data":"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83"} Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.586520 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s87hn" event={"ID":"a1c11a4a-0034-4480-882e-4bb7d31dc5a9","Type":"ContainerDied","Data":"18e119fa8a1a57e24d70e7e405d8b95e51285eb03710fa1be8e38d7ac60bb28b"} Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.586543 4666 scope.go:117] "RemoveContainer" containerID="a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.587144 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s87hn" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.587712 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" event={"ID":"a854a0c8-aad2-4681-9077-c8abd034fa73","Type":"ContainerStarted","Data":"0ec22d52d820ccd91ff2e17a4640c0ddb6ddffb40f58de509f050ae5e55045bc"} Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.611602 4666 scope.go:117] "RemoveContainer" containerID="0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.620443 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" podStartSLOduration=2.9569560089999998 podStartE2EDuration="3.620418243s" podCreationTimestamp="2025-12-03 13:08:25 +0000 UTC" firstStartedPulling="2025-12-03 13:08:26.479571162 +0000 UTC m=+3295.324532213" lastFinishedPulling="2025-12-03 13:08:27.143033396 +0000 UTC m=+3295.987994447" observedRunningTime="2025-12-03 13:08:28.615673795 +0000 UTC m=+3297.460634846" watchObservedRunningTime="2025-12-03 13:08:28.620418243 +0000 UTC m=+3297.465379304" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.645833 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.656501 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s87hn"] Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.657360 4666 scope.go:117] "RemoveContainer" containerID="5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.687195 4666 scope.go:117] "RemoveContainer" containerID="a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83" Dec 03 13:08:28 crc kubenswrapper[4666]: E1203 13:08:28.687814 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83\": container with ID starting with a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83 not found: ID does not exist" containerID="a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.687886 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83"} err="failed to get container status \"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83\": rpc error: code = NotFound desc = could not find container \"a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83\": container with ID starting with a042e7a3d204b19be9ebdc59b146c126d6e9821a52c4440b4077b253f075db83 not found: ID does not exist" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.687918 4666 scope.go:117] "RemoveContainer" containerID="0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3" Dec 03 13:08:28 crc kubenswrapper[4666]: E1203 13:08:28.688337 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3\": container with ID starting with 0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3 not 
found: ID does not exist" containerID="0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.688366 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3"} err="failed to get container status \"0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3\": rpc error: code = NotFound desc = could not find container \"0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3\": container with ID starting with 0fdfb551684e013722d5f3ae3b4801bda8ef003cac34c43cf3267bc4e923c8d3 not found: ID does not exist" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.688382 4666 scope.go:117] "RemoveContainer" containerID="5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd" Dec 03 13:08:28 crc kubenswrapper[4666]: E1203 13:08:28.688914 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd\": container with ID starting with 5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd not found: ID does not exist" containerID="5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd" Dec 03 13:08:28 crc kubenswrapper[4666]: I1203 13:08:28.688935 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd"} err="failed to get container status \"5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd\": rpc error: code = NotFound desc = could not find container \"5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd\": container with ID starting with 5cd85998523e1132e564f3b7c78bf7fed315426e643927a495a3f03311f904bd not found: ID does not exist" Dec 03 13:08:29 crc kubenswrapper[4666]: I1203 13:08:29.434953 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" path="/var/lib/kubelet/pods/a1c11a4a-0034-4480-882e-4bb7d31dc5a9/volumes" Dec 03 13:08:35 crc kubenswrapper[4666]: I1203 13:08:35.423914 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:08:35 crc kubenswrapper[4666]: E1203 13:08:35.424644 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:08:37 crc kubenswrapper[4666]: I1203 13:08:37.674079 4666 generic.go:334] "Generic (PLEG): container finished" podID="a854a0c8-aad2-4681-9077-c8abd034fa73" containerID="0ec22d52d820ccd91ff2e17a4640c0ddb6ddffb40f58de509f050ae5e55045bc" exitCode=0 Dec 03 13:08:37 crc kubenswrapper[4666]: I1203 13:08:37.674129 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" event={"ID":"a854a0c8-aad2-4681-9077-c8abd034fa73","Type":"ContainerDied","Data":"0ec22d52d820ccd91ff2e17a4640c0ddb6ddffb40f58de509f050ae5e55045bc"} Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.139441 4666 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.157006 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msrpv\" (UniqueName: \"kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv\") pod \"a854a0c8-aad2-4681-9077-c8abd034fa73\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.157152 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key\") pod \"a854a0c8-aad2-4681-9077-c8abd034fa73\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.157199 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph\") pod \"a854a0c8-aad2-4681-9077-c8abd034fa73\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.157400 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory\") pod \"a854a0c8-aad2-4681-9077-c8abd034fa73\" (UID: \"a854a0c8-aad2-4681-9077-c8abd034fa73\") " Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.164022 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph" (OuterVolumeSpecName: "ceph") pod "a854a0c8-aad2-4681-9077-c8abd034fa73" (UID: "a854a0c8-aad2-4681-9077-c8abd034fa73"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.166326 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv" (OuterVolumeSpecName: "kube-api-access-msrpv") pod "a854a0c8-aad2-4681-9077-c8abd034fa73" (UID: "a854a0c8-aad2-4681-9077-c8abd034fa73"). InnerVolumeSpecName "kube-api-access-msrpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.185588 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory" (OuterVolumeSpecName: "inventory") pod "a854a0c8-aad2-4681-9077-c8abd034fa73" (UID: "a854a0c8-aad2-4681-9077-c8abd034fa73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.190285 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a854a0c8-aad2-4681-9077-c8abd034fa73" (UID: "a854a0c8-aad2-4681-9077-c8abd034fa73"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.260550 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.260586 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msrpv\" (UniqueName: \"kubernetes.io/projected/a854a0c8-aad2-4681-9077-c8abd034fa73-kube-api-access-msrpv\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.260597 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.260605 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a854a0c8-aad2-4681-9077-c8abd034fa73-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.690680 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" event={"ID":"a854a0c8-aad2-4681-9077-c8abd034fa73","Type":"ContainerDied","Data":"618ec5d31aa2c7ccaa19167686be2441dcf80716d5f30e0b91e89f4795c726f5"} Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.690717 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="618ec5d31aa2c7ccaa19167686be2441dcf80716d5f30e0b91e89f4795c726f5" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.690727 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.770680 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8"] Dec 03 13:08:39 crc kubenswrapper[4666]: E1203 13:08:39.771331 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a854a0c8-aad2-4681-9077-c8abd034fa73" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.771444 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a854a0c8-aad2-4681-9077-c8abd034fa73" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:39 crc kubenswrapper[4666]: E1203 13:08:39.771512 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="extract-utilities" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.771601 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="extract-utilities" Dec 03 13:08:39 crc kubenswrapper[4666]: E1203 13:08:39.771697 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="extract-content" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.772994 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="extract-content" Dec 03 13:08:39 crc kubenswrapper[4666]: E1203 13:08:39.773127 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="registry-server" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 
13:08:39.773193 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="registry-server" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.773466 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c11a4a-0034-4480-882e-4bb7d31dc5a9" containerName="registry-server" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.773524 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="a854a0c8-aad2-4681-9077-c8abd034fa73" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.774185 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.776276 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.778046 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.778406 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.779505 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.779713 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.779838 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.779980 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.780525 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.781262 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8"] Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.871558 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.871608 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.871638 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872014 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872047 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdp45\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872135 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872158 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872215 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872418 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872490 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872537 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872710 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.872746 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974293 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974347 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdp45\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974370 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974388 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974409 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974449 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974470 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974492 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974538 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974556 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974593 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974614 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.974632 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.979533 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.981709 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.982073 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.982505 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.982769 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.982774 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.983075 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.983269 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.983569 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.983847 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.984794 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.990321 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:39 crc kubenswrapper[4666]: I1203 13:08:39.991787 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdp45\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:40 crc kubenswrapper[4666]: I1203 13:08:40.088797 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:08:40 crc kubenswrapper[4666]: I1203 13:08:40.631552 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8"] Dec 03 13:08:40 crc kubenswrapper[4666]: I1203 13:08:40.700724 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" event={"ID":"7a99d58b-e139-49f4-8689-faeb388b82ff","Type":"ContainerStarted","Data":"1a79813cc5242c18af94bebc7baad59974c1cd8549b83250fbd4857dacf6fa0e"} Dec 03 13:08:41 crc kubenswrapper[4666]: I1203 13:08:41.713252 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" event={"ID":"7a99d58b-e139-49f4-8689-faeb388b82ff","Type":"ContainerStarted","Data":"f96e903007a000365a83937f879268516981106001e9142df6805a9ff042b670"} Dec 03 13:08:41 crc kubenswrapper[4666]: I1203 13:08:41.732472 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" podStartSLOduration=1.938921763 podStartE2EDuration="2.732450286s" podCreationTimestamp="2025-12-03 13:08:39 +0000 UTC" firstStartedPulling="2025-12-03 13:08:40.634244422 +0000 UTC m=+3309.479205473" lastFinishedPulling="2025-12-03 13:08:41.427772945 +0000 UTC m=+3310.272733996" observedRunningTime="2025-12-03 13:08:41.732359774 +0000 UTC m=+3310.577320845" watchObservedRunningTime="2025-12-03 13:08:41.732450286 +0000 UTC m=+3310.577411327" Dec 03 13:08:48 crc kubenswrapper[4666]: I1203 13:08:48.424044 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:08:48 crc kubenswrapper[4666]: E1203 13:08:48.425778 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:09:03 crc kubenswrapper[4666]: I1203 13:09:03.424227 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:09:03 crc kubenswrapper[4666]: E1203 13:09:03.425019 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:09:10 crc kubenswrapper[4666]: I1203 13:09:10.957070 4666 generic.go:334] "Generic (PLEG): container finished" podID="7a99d58b-e139-49f4-8689-faeb388b82ff" containerID="f96e903007a000365a83937f879268516981106001e9142df6805a9ff042b670" exitCode=0 Dec 03 13:09:10 crc kubenswrapper[4666]: I1203 13:09:10.957331 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" 
event={"ID":"7a99d58b-e139-49f4-8689-faeb388b82ff","Type":"ContainerDied","Data":"f96e903007a000365a83937f879268516981106001e9142df6805a9ff042b670"} Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.362281 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.405873 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.405934 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.406039 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.406078 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.406214 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.406913 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.406968 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407014 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407044 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407115 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407292 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdp45\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407324 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.407387 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7a99d58b-e139-49f4-8689-faeb388b82ff\" (UID: \"7a99d58b-e139-49f4-8689-faeb388b82ff\") " Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.412862 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.413118 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.413432 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.413542 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.414668 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.415664 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.416744 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.416783 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45" (OuterVolumeSpecName: "kube-api-access-tdp45") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "kube-api-access-tdp45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.420125 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.421336 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph" (OuterVolumeSpecName: "ceph") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.422328 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.442658 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.443331 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory" (OuterVolumeSpecName: "inventory") pod "7a99d58b-e139-49f4-8689-faeb388b82ff" (UID: "7a99d58b-e139-49f4-8689-faeb388b82ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509263 4666 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509295 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509308 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509319 4666 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509328 4666 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509338 4666 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509347 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509354 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509362 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509371 4666 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509402 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdp45\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-kube-api-access-tdp45\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509411 4666 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a99d58b-e139-49f4-8689-faeb388b82ff-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.509420 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7a99d58b-e139-49f4-8689-faeb388b82ff-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.975423 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" event={"ID":"7a99d58b-e139-49f4-8689-faeb388b82ff","Type":"ContainerDied","Data":"1a79813cc5242c18af94bebc7baad59974c1cd8549b83250fbd4857dacf6fa0e"} Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.975482 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a79813cc5242c18af94bebc7baad59974c1cd8549b83250fbd4857dacf6fa0e" Dec 03 13:09:12 crc kubenswrapper[4666]: I1203 13:09:12.975479 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.066620 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7"] Dec 03 13:09:13 crc kubenswrapper[4666]: E1203 13:09:13.067165 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a99d58b-e139-49f4-8689-faeb388b82ff" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.067189 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a99d58b-e139-49f4-8689-faeb388b82ff" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.067400 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a99d58b-e139-49f4-8689-faeb388b82ff" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.068579 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.070684 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.071583 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.071662 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.073616 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.074732 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.075241 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7"] Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.119697 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.119856 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.119930 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.119959 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt8k\" (UniqueName: \"kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.221356 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.221759 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.221799 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt8k\" (UniqueName: \"kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.221894 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.225781 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.225835 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.232603 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.241499 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt8k\" (UniqueName: \"kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-29td7\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.427701 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.905434 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7"] Dec 03 13:09:13 crc kubenswrapper[4666]: I1203 13:09:13.983131 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" event={"ID":"0d695826-87fe-4625-9925-988306a9e16b","Type":"ContainerStarted","Data":"7381269531d15539aace970d0d81d5724d46ec591a33a89adfac791f96ac5410"} Dec 03 13:09:16 crc kubenswrapper[4666]: I1203 13:09:16.003038 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" event={"ID":"0d695826-87fe-4625-9925-988306a9e16b","Type":"ContainerStarted","Data":"b1e392b361b006692bf05474a7be6ee0e8ccfeed887d3b3a062ecf842f9d6523"} Dec 03 13:09:16 crc kubenswrapper[4666]: I1203 13:09:16.022524 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" podStartSLOduration=2.20158476 podStartE2EDuration="3.022506373s" podCreationTimestamp="2025-12-03 13:09:13 +0000 UTC" firstStartedPulling="2025-12-03 13:09:13.909813733 +0000 UTC m=+3342.754774784" lastFinishedPulling="2025-12-03 13:09:14.730735306 +0000 UTC m=+3343.575696397" observedRunningTime="2025-12-03 13:09:16.020603082 +0000 UTC m=+3344.865564143" watchObservedRunningTime="2025-12-03 13:09:16.022506373 +0000 UTC m=+3344.867467424" Dec 03 13:09:18 crc kubenswrapper[4666]: I1203 13:09:18.423357 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:09:18 crc kubenswrapper[4666]: E1203 13:09:18.423963 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:09:20 crc kubenswrapper[4666]: I1203 13:09:20.068833 4666 generic.go:334] "Generic (PLEG): container finished" podID="0d695826-87fe-4625-9925-988306a9e16b" containerID="b1e392b361b006692bf05474a7be6ee0e8ccfeed887d3b3a062ecf842f9d6523" exitCode=0 Dec 03 13:09:20 crc kubenswrapper[4666]: I1203 13:09:20.068924 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" event={"ID":"0d695826-87fe-4625-9925-988306a9e16b","Type":"ContainerDied","Data":"b1e392b361b006692bf05474a7be6ee0e8ccfeed887d3b3a062ecf842f9d6523"} Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.490126 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.689779 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmt8k\" (UniqueName: \"kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k\") pod \"0d695826-87fe-4625-9925-988306a9e16b\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.689881 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key\") pod \"0d695826-87fe-4625-9925-988306a9e16b\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.689997 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph\") pod \"0d695826-87fe-4625-9925-988306a9e16b\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.690029 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory\") pod \"0d695826-87fe-4625-9925-988306a9e16b\" (UID: \"0d695826-87fe-4625-9925-988306a9e16b\") " Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.694932 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph" (OuterVolumeSpecName: "ceph") pod "0d695826-87fe-4625-9925-988306a9e16b" (UID: "0d695826-87fe-4625-9925-988306a9e16b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.701243 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k" (OuterVolumeSpecName: "kube-api-access-hmt8k") pod "0d695826-87fe-4625-9925-988306a9e16b" (UID: "0d695826-87fe-4625-9925-988306a9e16b"). InnerVolumeSpecName "kube-api-access-hmt8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.722510 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d695826-87fe-4625-9925-988306a9e16b" (UID: "0d695826-87fe-4625-9925-988306a9e16b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.725395 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory" (OuterVolumeSpecName: "inventory") pod "0d695826-87fe-4625-9925-988306a9e16b" (UID: "0d695826-87fe-4625-9925-988306a9e16b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.792596 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.792637 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.792660 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmt8k\" (UniqueName: \"kubernetes.io/projected/0d695826-87fe-4625-9925-988306a9e16b-kube-api-access-hmt8k\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:21 crc kubenswrapper[4666]: I1203 13:09:21.792683 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d695826-87fe-4625-9925-988306a9e16b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.087541 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" event={"ID":"0d695826-87fe-4625-9925-988306a9e16b","Type":"ContainerDied","Data":"7381269531d15539aace970d0d81d5724d46ec591a33a89adfac791f96ac5410"} Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.087582 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7381269531d15539aace970d0d81d5724d46ec591a33a89adfac791f96ac5410" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.087617 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-29td7" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.170115 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q"] Dec 03 13:09:22 crc kubenswrapper[4666]: E1203 13:09:22.170577 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d695826-87fe-4625-9925-988306a9e16b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.170601 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d695826-87fe-4625-9925-988306a9e16b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.170823 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d695826-87fe-4625-9925-988306a9e16b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.171474 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.173310 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.173810 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.173819 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.173899 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.174499 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.174769 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.181036 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q"] Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.300077 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.300152 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.300224 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5k7d\" (UniqueName: \"kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.300248 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.300909 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 
13:09:22.300945 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.402865 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.402936 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.403032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5k7d\" (UniqueName: \"kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.403062 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.403131 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.403167 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.404210 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.407767 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.408242 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.409698 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.411036 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.424579 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5k7d\" (UniqueName: \"kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8bt2q\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.489881 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:09:22 crc kubenswrapper[4666]: I1203 13:09:22.993461 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q"] Dec 03 13:09:23 crc kubenswrapper[4666]: I1203 13:09:23.098102 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" event={"ID":"1f8dd079-749d-4ad3-8365-eb026d693512","Type":"ContainerStarted","Data":"1e774c9b8713fa8265176d23bf79df492a582c3ef8102e70db4a85ddea68b58c"} Dec 03 13:09:24 crc kubenswrapper[4666]: I1203 13:09:24.107047 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" event={"ID":"1f8dd079-749d-4ad3-8365-eb026d693512","Type":"ContainerStarted","Data":"a56ca45d365cb295adc0b131c4843e783347aeee857540f84856a345856a27d7"} Dec 03 13:09:24 crc kubenswrapper[4666]: I1203 13:09:24.126544 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" podStartSLOduration=1.620598027 podStartE2EDuration="2.126522199s" podCreationTimestamp="2025-12-03 13:09:22 +0000 UTC" firstStartedPulling="2025-12-03 13:09:22.998270253 +0000 UTC m=+3351.843231304" lastFinishedPulling="2025-12-03 13:09:23.504194425 +0000 UTC m=+3352.349155476" observedRunningTime="2025-12-03 13:09:24.123057285 +0000 UTC m=+3352.968018356" watchObservedRunningTime="2025-12-03 13:09:24.126522199 +0000 UTC m=+3352.971483260" Dec 03 13:09:32 crc kubenswrapper[4666]: I1203 13:09:32.423994 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:09:32 crc kubenswrapper[4666]: E1203 13:09:32.424930 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:09:44 crc kubenswrapper[4666]: I1203 13:09:44.424535 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:09:44 crc kubenswrapper[4666]: E1203 13:09:44.425286 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:09:55 crc kubenswrapper[4666]: I1203 13:09:55.423613 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:09:55 crc kubenswrapper[4666]: E1203 13:09:55.424428 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" 
podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:10:08 crc kubenswrapper[4666]: I1203 13:10:08.423790 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:10:08 crc kubenswrapper[4666]: E1203 13:10:08.424418 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.351333 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.354612 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.374840 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.407811 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.408232 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.408596 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khj8b\" (UniqueName: \"kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.510015 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khj8b\" (UniqueName: \"kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.510487 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.510982 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.511032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.511421 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.535630 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khj8b\" (UniqueName: \"kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b\") pod \"community-operators-flb4n\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:20 crc kubenswrapper[4666]: I1203 13:10:20.679016 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:21 crc kubenswrapper[4666]: I1203 13:10:21.179153 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:21 crc kubenswrapper[4666]: I1203 13:10:21.595550 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerStarted","Data":"075eb4d94aa3ae332702ccb19f827c71373c604d90770fb052f0ddbba509f1f0"} Dec 03 13:10:22 crc kubenswrapper[4666]: I1203 13:10:22.423792 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:10:22 crc kubenswrapper[4666]: E1203 13:10:22.424141 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:10:22 crc kubenswrapper[4666]: I1203 13:10:22.607935 4666 generic.go:334] "Generic (PLEG): container finished" podID="16dda495-9a3a-48f3-a213-65422177fc9c" containerID="ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4" exitCode=0 Dec 03 13:10:22 crc kubenswrapper[4666]: I1203 13:10:22.608123 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerDied","Data":"ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4"} Dec 03 13:10:23 crc kubenswrapper[4666]: I1203 13:10:23.620299 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" 
event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerStarted","Data":"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741"} Dec 03 13:10:24 crc kubenswrapper[4666]: I1203 13:10:24.631825 4666 generic.go:334] "Generic (PLEG): container finished" podID="16dda495-9a3a-48f3-a213-65422177fc9c" containerID="f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741" exitCode=0 Dec 03 13:10:24 crc kubenswrapper[4666]: I1203 13:10:24.631941 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerDied","Data":"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741"} Dec 03 13:10:25 crc kubenswrapper[4666]: I1203 13:10:25.647710 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerStarted","Data":"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc"} Dec 03 13:10:25 crc kubenswrapper[4666]: I1203 13:10:25.671427 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-flb4n" podStartSLOduration=3.212308893 podStartE2EDuration="5.671400237s" podCreationTimestamp="2025-12-03 13:10:20 +0000 UTC" firstStartedPulling="2025-12-03 13:10:22.613128028 +0000 UTC m=+3411.458089079" lastFinishedPulling="2025-12-03 13:10:25.072219372 +0000 UTC m=+3413.917180423" observedRunningTime="2025-12-03 13:10:25.668857398 +0000 UTC m=+3414.513818459" watchObservedRunningTime="2025-12-03 13:10:25.671400237 +0000 UTC m=+3414.516361288" Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.679966 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.680509 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.692585 4666 generic.go:334] "Generic (PLEG): container finished" podID="1f8dd079-749d-4ad3-8365-eb026d693512" containerID="a56ca45d365cb295adc0b131c4843e783347aeee857540f84856a345856a27d7" exitCode=0 Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.692641 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" event={"ID":"1f8dd079-749d-4ad3-8365-eb026d693512","Type":"ContainerDied","Data":"a56ca45d365cb295adc0b131c4843e783347aeee857540f84856a345856a27d7"} Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.727149 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.774239 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:30 crc kubenswrapper[4666]: I1203 13:10:30.961868 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.115283 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.149454 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.149562 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.149605 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.156277 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.156306 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph" (OuterVolumeSpecName: "ceph") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.180488 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.250661 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.250744 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.250766 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5k7d\" (UniqueName: \"kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d\") pod \"1f8dd079-749d-4ad3-8365-eb026d693512\" (UID: \"1f8dd079-749d-4ad3-8365-eb026d693512\") " Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.251127 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.251141 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.251150 4666 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.253207 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d" (OuterVolumeSpecName: "kube-api-access-w5k7d") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "kube-api-access-w5k7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.269811 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.273002 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory" (OuterVolumeSpecName: "inventory") pod "1f8dd079-749d-4ad3-8365-eb026d693512" (UID: "1f8dd079-749d-4ad3-8365-eb026d693512"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.353593 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f8dd079-749d-4ad3-8365-eb026d693512-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.353683 4666 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f8dd079-749d-4ad3-8365-eb026d693512-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.353705 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5k7d\" (UniqueName: \"kubernetes.io/projected/1f8dd079-749d-4ad3-8365-eb026d693512-kube-api-access-w5k7d\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.709391 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.709385 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8bt2q" event={"ID":"1f8dd079-749d-4ad3-8365-eb026d693512","Type":"ContainerDied","Data":"1e774c9b8713fa8265176d23bf79df492a582c3ef8102e70db4a85ddea68b58c"} Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.709443 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e774c9b8713fa8265176d23bf79df492a582c3ef8102e70db4a85ddea68b58c" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.709517 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-flb4n" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="registry-server" containerID="cri-o://eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc" gracePeriod=2 Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.795542 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk"] Dec 03 13:10:32 crc kubenswrapper[4666]: E1203 13:10:32.795970 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8dd079-749d-4ad3-8365-eb026d693512" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.795992 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8dd079-749d-4ad3-8365-eb026d693512" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.796208 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8dd079-749d-4ad3-8365-eb026d693512" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.796928 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.799605 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.800035 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.800309 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.800551 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.800787 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.802678 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.803095 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.806951 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk"] Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964382 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhsp\" (UniqueName: \"kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964729 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964755 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964788 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964865 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964923 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:32 crc kubenswrapper[4666]: I1203 13:10:32.964953 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.066870 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.066969 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.067018 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhsp\" (UniqueName: \"kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.067077 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.067138 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.067307 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.067455 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.071749 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.071776 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.071797 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.071771 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.072635 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.083289 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.086501 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhsp\" (UniqueName: \"kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.130423 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.210579 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.269614 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content\") pod \"16dda495-9a3a-48f3-a213-65422177fc9c\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.272594 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khj8b\" (UniqueName: \"kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b\") pod \"16dda495-9a3a-48f3-a213-65422177fc9c\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.272644 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities\") pod \"16dda495-9a3a-48f3-a213-65422177fc9c\" (UID: \"16dda495-9a3a-48f3-a213-65422177fc9c\") " Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.273549 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities" (OuterVolumeSpecName: "utilities") pod "16dda495-9a3a-48f3-a213-65422177fc9c" (UID: "16dda495-9a3a-48f3-a213-65422177fc9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.276681 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b" (OuterVolumeSpecName: "kube-api-access-khj8b") pod "16dda495-9a3a-48f3-a213-65422177fc9c" (UID: "16dda495-9a3a-48f3-a213-65422177fc9c"). InnerVolumeSpecName "kube-api-access-khj8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.327103 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16dda495-9a3a-48f3-a213-65422177fc9c" (UID: "16dda495-9a3a-48f3-a213-65422177fc9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.377048 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khj8b\" (UniqueName: \"kubernetes.io/projected/16dda495-9a3a-48f3-a213-65422177fc9c-kube-api-access-khj8b\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.377098 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.377110 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16dda495-9a3a-48f3-a213-65422177fc9c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.635006 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk"] Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.723024 4666 generic.go:334] "Generic (PLEG): container finished" podID="16dda495-9a3a-48f3-a213-65422177fc9c" containerID="eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc" exitCode=0 Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.723130 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerDied","Data":"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc"} Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.723142 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flb4n" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.723181 4666 scope.go:117] "RemoveContainer" containerID="eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.723167 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flb4n" event={"ID":"16dda495-9a3a-48f3-a213-65422177fc9c","Type":"ContainerDied","Data":"075eb4d94aa3ae332702ccb19f827c71373c604d90770fb052f0ddbba509f1f0"} Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.725000 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" event={"ID":"109066d4-b3b2-4ec6-ba71-cfc35d9ca300","Type":"ContainerStarted","Data":"e105d8a92e45a3e8cbe65c785ffb9d4e398f2274914b8533674fded8b4a6e4f5"} Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.756641 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.758509 4666 scope.go:117] "RemoveContainer" containerID="f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.766912 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-flb4n"] Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.788763 4666 scope.go:117] "RemoveContainer" containerID="ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.811197 4666 scope.go:117] "RemoveContainer" 
containerID="eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc" Dec 03 13:10:33 crc kubenswrapper[4666]: E1203 13:10:33.811690 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc\": container with ID starting with eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc not found: ID does not exist" containerID="eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.811735 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc"} err="failed to get container status \"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc\": rpc error: code = NotFound desc = could not find container \"eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc\": container with ID starting with eaa050de65de894dcc5758f088ee8fca51d33ed78a3c4ce169ede6c11a9e6dcc not found: ID does not exist" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.811767 4666 scope.go:117] "RemoveContainer" containerID="f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741" Dec 03 13:10:33 crc kubenswrapper[4666]: E1203 13:10:33.812141 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741\": container with ID starting with f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741 not found: ID does not exist" containerID="f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.812173 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741"} err="failed to get container status \"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741\": rpc error: code = NotFound desc = could not find container \"f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741\": container with ID starting with f2b2f8e242bbc04f4979679870ca9ab9952f0a6e1a5e2b4e5ff4d8ae1ce1c741 not found: ID does not exist" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.812194 4666 scope.go:117] "RemoveContainer" containerID="ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4" Dec 03 13:10:33 crc kubenswrapper[4666]: E1203 13:10:33.812477 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4\": container with ID starting with ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4 not found: ID does not exist" containerID="ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4" Dec 03 13:10:33 crc kubenswrapper[4666]: I1203 13:10:33.812520 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4"} err="failed to get container status \"ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4\": rpc error: code = NotFound desc = could not find container \"ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4\": container with ID starting with 
ed6ea7930d819c6fd40a1f39d2a12e67a7f4d240797280b72fe981c30b2bdfd4 not found: ID does not exist" Dec 03 13:10:34 crc kubenswrapper[4666]: I1203 13:10:34.738529 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" event={"ID":"109066d4-b3b2-4ec6-ba71-cfc35d9ca300","Type":"ContainerStarted","Data":"7cd87fe26c0c9682d373fc806079a907a5555cca15a9ffc95ec1a308a9ee60f7"} Dec 03 13:10:34 crc kubenswrapper[4666]: I1203 13:10:34.760688 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" podStartSLOduration=2.239363142 podStartE2EDuration="2.760665672s" podCreationTimestamp="2025-12-03 13:10:32 +0000 UTC" firstStartedPulling="2025-12-03 13:10:33.641934814 +0000 UTC m=+3422.486895865" lastFinishedPulling="2025-12-03 13:10:34.163237344 +0000 UTC m=+3423.008198395" observedRunningTime="2025-12-03 13:10:34.7554322 +0000 UTC m=+3423.600393261" watchObservedRunningTime="2025-12-03 13:10:34.760665672 +0000 UTC m=+3423.605626723" Dec 03 13:10:35 crc kubenswrapper[4666]: I1203 13:10:35.425074 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:10:35 crc kubenswrapper[4666]: E1203 13:10:35.425563 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:10:35 crc kubenswrapper[4666]: I1203 13:10:35.437245 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" path="/var/lib/kubelet/pods/16dda495-9a3a-48f3-a213-65422177fc9c/volumes" Dec 03 13:10:48 crc kubenswrapper[4666]: I1203 13:10:48.424003 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:10:48 crc kubenswrapper[4666]: I1203 13:10:48.891932 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b"} Dec 03 13:11:34 crc kubenswrapper[4666]: I1203 13:11:34.276651 4666 generic.go:334] "Generic (PLEG): container finished" podID="109066d4-b3b2-4ec6-ba71-cfc35d9ca300" containerID="7cd87fe26c0c9682d373fc806079a907a5555cca15a9ffc95ec1a308a9ee60f7" exitCode=0 Dec 03 13:11:34 crc kubenswrapper[4666]: I1203 13:11:34.276760 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" event={"ID":"109066d4-b3b2-4ec6-ba71-cfc35d9ca300","Type":"ContainerDied","Data":"7cd87fe26c0c9682d373fc806079a907a5555cca15a9ffc95ec1a308a9ee60f7"} Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.680797 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.698074 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jhsp\" (UniqueName: \"kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.698160 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.699293 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.699397 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.699447 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.699478 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.699614 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory\") pod \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\" (UID: \"109066d4-b3b2-4ec6-ba71-cfc35d9ca300\") " Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.704896 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp" (OuterVolumeSpecName: "kube-api-access-2jhsp") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "kube-api-access-2jhsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.705295 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.706354 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph" (OuterVolumeSpecName: "ceph") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.733316 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory" (OuterVolumeSpecName: "inventory") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.733999 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.750281 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.757957 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "109066d4-b3b2-4ec6-ba71-cfc35d9ca300" (UID: "109066d4-b3b2-4ec6-ba71-cfc35d9ca300"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802572 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802608 4666 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802620 4666 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802632 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802642 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jhsp\" (UniqueName: \"kubernetes.io/projected/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-kube-api-access-2jhsp\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802652 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:35 crc kubenswrapper[4666]: I1203 13:11:35.802661 4666 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109066d4-b3b2-4ec6-ba71-cfc35d9ca300-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.293418 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" event={"ID":"109066d4-b3b2-4ec6-ba71-cfc35d9ca300","Type":"ContainerDied","Data":"e105d8a92e45a3e8cbe65c785ffb9d4e398f2274914b8533674fded8b4a6e4f5"} Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.293459 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e105d8a92e45a3e8cbe65c785ffb9d4e398f2274914b8533674fded8b4a6e4f5" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.293497 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.437772 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v"] Dec 03 13:11:36 crc kubenswrapper[4666]: E1203 13:11:36.438184 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="extract-utilities" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438206 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="extract-utilities" Dec 03 13:11:36 crc kubenswrapper[4666]: E1203 13:11:36.438226 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109066d4-b3b2-4ec6-ba71-cfc35d9ca300" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438234 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="109066d4-b3b2-4ec6-ba71-cfc35d9ca300" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 13:11:36 crc kubenswrapper[4666]: E1203 13:11:36.438248 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="registry-server" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438256 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="registry-server" Dec 03 13:11:36 crc kubenswrapper[4666]: E1203 13:11:36.438282 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="extract-content" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438289 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="extract-content" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438449 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="109066d4-b3b2-4ec6-ba71-cfc35d9ca300" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.438465 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dda495-9a3a-48f3-a213-65422177fc9c" containerName="registry-server" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.439055 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.442713 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.442736 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.442867 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.442973 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.443022 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.443079 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.457460 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v"] Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512285 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512424 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512483 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512531 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512732 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6dp\" (UniqueName: \"kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.512906 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.614521 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.614865 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6dp\" (UniqueName: \"kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.615202 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.615616 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.615685 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.615717 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.619566 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.620111 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.620293 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.620402 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.629690 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.632989 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6dp\" (UniqueName: \"kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:36 crc kubenswrapper[4666]: I1203 13:11:36.760157 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:11:37 crc kubenswrapper[4666]: I1203 13:11:37.243349 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v"] Dec 03 13:11:37 crc kubenswrapper[4666]: W1203 13:11:37.255850 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda165368e_be15_48d7_afad_92850b6844ea.slice/crio-caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2 WatchSource:0}: Error finding container caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2: Status 404 returned error can't find the container with id caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2 Dec 03 13:11:37 crc kubenswrapper[4666]: I1203 13:11:37.259224 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:11:37 crc kubenswrapper[4666]: I1203 13:11:37.301248 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" event={"ID":"a165368e-be15-48d7-afad-92850b6844ea","Type":"ContainerStarted","Data":"caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2"} Dec 03 13:11:38 crc kubenswrapper[4666]: I1203 13:11:38.340208 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" event={"ID":"a165368e-be15-48d7-afad-92850b6844ea","Type":"ContainerStarted","Data":"12c6015f7b5734fbcb28b83a89722b19c170ff843d0841abad4a108d0076705e"} Dec 03 13:11:38 crc kubenswrapper[4666]: I1203 13:11:38.365422 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" podStartSLOduration=1.88822298 podStartE2EDuration="2.36539853s" podCreationTimestamp="2025-12-03 13:11:36 +0000 UTC" firstStartedPulling="2025-12-03 13:11:37.258986774 +0000 UTC m=+3486.103947815" lastFinishedPulling="2025-12-03 13:11:37.736162304 +0000 UTC m=+3486.581123365" observedRunningTime="2025-12-03 13:11:38.360263871 +0000 UTC m=+3487.205224932" watchObservedRunningTime="2025-12-03 13:11:38.36539853 +0000 UTC m=+3487.210359601" Dec 03 13:13:09 crc kubenswrapper[4666]: I1203 13:13:09.866127 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:13:09 crc kubenswrapper[4666]: I1203 13:13:09.866830 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:13:39 crc kubenswrapper[4666]: I1203 13:13:39.866348 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:13:39 crc kubenswrapper[4666]: I1203 13:13:39.868595 4666 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:14:09 crc kubenswrapper[4666]: I1203 13:14:09.866808 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:14:09 crc kubenswrapper[4666]: I1203 13:14:09.867455 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:14:09 crc kubenswrapper[4666]: I1203 13:14:09.867502 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:14:09 crc kubenswrapper[4666]: I1203 13:14:09.868274 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:14:09 crc kubenswrapper[4666]: I1203 13:14:09.868330 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b" gracePeriod=600 Dec 03 13:14:10 crc kubenswrapper[4666]: I1203 13:14:10.647679 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b" exitCode=0 Dec 03 13:14:10 crc kubenswrapper[4666]: I1203 13:14:10.647740 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b"} Dec 03 13:14:10 crc kubenswrapper[4666]: I1203 13:14:10.648241 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53"} Dec 03 13:14:10 crc kubenswrapper[4666]: I1203 13:14:10.648260 4666 scope.go:117] "RemoveContainer" containerID="74c31b56a64b8272758e7c1e85130cba5a4c74e9cceebe5ddd50169a4f0d32cd" Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.813520 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.815933 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.824400 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.900082 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.900415 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bf9x\" (UniqueName: \"kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:23 crc kubenswrapper[4666]: I1203 13:14:23.900460 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.001773 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.001825 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bf9x\" (UniqueName: \"kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.001880 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.002300 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.002371 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.014475 4666 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.020989 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.027858 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.031884 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bf9x\" (UniqueName: \"kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x\") pod \"redhat-operators-lxcnb\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.103012 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lb6k\" (UniqueName: \"kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.103055 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.103233 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.144604 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.206966 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.207071 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lb6k\" (UniqueName: \"kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.207119 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.207612 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.207889 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.225446 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lb6k\" (UniqueName: \"kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k\") pod \"certified-operators-b654x\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.378201 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.709431 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.760677 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerStarted","Data":"bf75889af0278a6fbc32f03153232c4e6a223605ae30cac92b03b4dce98fc1ef"} Dec 03 13:14:24 crc kubenswrapper[4666]: I1203 13:14:24.763136 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:24 crc kubenswrapper[4666]: W1203 13:14:24.770437 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5e235e_cba9_4c1c_a365_4b089a272361.slice/crio-2bda50bec4bdefb03cebf8ce643fe0c382e1da1de528395522d8ef9e39d8e831 WatchSource:0}: Error finding container 2bda50bec4bdefb03cebf8ce643fe0c382e1da1de528395522d8ef9e39d8e831: Status 404 returned error can't find the container with id 2bda50bec4bdefb03cebf8ce643fe0c382e1da1de528395522d8ef9e39d8e831 Dec 03 13:14:25 crc kubenswrapper[4666]: I1203 13:14:25.773736 4666 generic.go:334] "Generic (PLEG): container finished" podID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerID="23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8" exitCode=0 Dec 03 13:14:25 crc kubenswrapper[4666]: I1203 13:14:25.773791 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerDied","Data":"23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8"} Dec 03 13:14:25 crc kubenswrapper[4666]: I1203 13:14:25.774174 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerStarted","Data":"2bda50bec4bdefb03cebf8ce643fe0c382e1da1de528395522d8ef9e39d8e831"} Dec 03 13:14:25 crc kubenswrapper[4666]: I1203 13:14:25.776080 4666 generic.go:334] "Generic (PLEG): container finished" podID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerID="8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34" exitCode=0 Dec 03 13:14:25 crc kubenswrapper[4666]: I1203 13:14:25.776132 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerDied","Data":"8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34"} Dec 03 13:14:26 crc kubenswrapper[4666]: I1203 13:14:26.786275 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerStarted","Data":"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae"} Dec 03 13:14:27 crc kubenswrapper[4666]: I1203 13:14:27.809743 4666 generic.go:334] "Generic (PLEG): container finished" podID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerID="12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae" exitCode=0 Dec 03 13:14:27 crc kubenswrapper[4666]: I1203 13:14:27.809819 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" 
event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerDied","Data":"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae"} Dec 03 13:14:27 crc kubenswrapper[4666]: I1203 13:14:27.814166 4666 generic.go:334] "Generic (PLEG): container finished" podID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerID="bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37" exitCode=0 Dec 03 13:14:27 crc kubenswrapper[4666]: I1203 13:14:27.814200 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerDied","Data":"bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37"} Dec 03 13:14:28 crc kubenswrapper[4666]: I1203 13:14:28.828255 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerStarted","Data":"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65"} Dec 03 13:14:28 crc kubenswrapper[4666]: I1203 13:14:28.834070 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerStarted","Data":"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece"} Dec 03 13:14:28 crc kubenswrapper[4666]: I1203 13:14:28.853466 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lxcnb" podStartSLOduration=3.389390822 podStartE2EDuration="5.853446561s" podCreationTimestamp="2025-12-03 13:14:23 +0000 UTC" firstStartedPulling="2025-12-03 13:14:25.780374258 +0000 UTC m=+3654.625335309" lastFinishedPulling="2025-12-03 13:14:28.244429977 +0000 UTC m=+3657.089391048" observedRunningTime="2025-12-03 13:14:28.842958948 +0000 UTC m=+3657.687920009" watchObservedRunningTime="2025-12-03 13:14:28.853446561 +0000 UTC m=+3657.698407612" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.146216 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.146851 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.203719 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.230968 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b654x" podStartSLOduration=8.713539626 podStartE2EDuration="11.230951412s" podCreationTimestamp="2025-12-03 13:14:23 +0000 UTC" firstStartedPulling="2025-12-03 13:14:25.77822047 +0000 UTC m=+3654.623181561" lastFinishedPulling="2025-12-03 13:14:28.295632266 +0000 UTC m=+3657.140593347" observedRunningTime="2025-12-03 13:14:28.869946855 +0000 UTC m=+3657.714907906" watchObservedRunningTime="2025-12-03 13:14:34.230951412 +0000 UTC m=+3663.075912473" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.379238 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.379282 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.425188 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.948198 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:34 crc kubenswrapper[4666]: I1203 13:14:34.960282 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:35 crc kubenswrapper[4666]: I1203 13:14:35.841843 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:36 crc kubenswrapper[4666]: I1203 13:14:36.907996 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lxcnb" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="registry-server" containerID="cri-o://6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65" gracePeriod=2 Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.245726 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.246729 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b654x" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="registry-server" containerID="cri-o://a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece" gracePeriod=2 Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.467914 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.637302 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.660064 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bf9x\" (UniqueName: \"kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x\") pod \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.660185 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content\") pod \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.660305 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities\") pod \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\" (UID: \"16eff8c1-0b0a-4438-89fc-1c4ac110b24f\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.665185 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities" (OuterVolumeSpecName: "utilities") pod "16eff8c1-0b0a-4438-89fc-1c4ac110b24f" (UID: "16eff8c1-0b0a-4438-89fc-1c4ac110b24f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.667443 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x" (OuterVolumeSpecName: "kube-api-access-5bf9x") pod "16eff8c1-0b0a-4438-89fc-1c4ac110b24f" (UID: "16eff8c1-0b0a-4438-89fc-1c4ac110b24f"). InnerVolumeSpecName "kube-api-access-5bf9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.763860 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lb6k\" (UniqueName: \"kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k\") pod \"3c5e235e-cba9-4c1c-a365-4b089a272361\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.763971 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities\") pod \"3c5e235e-cba9-4c1c-a365-4b089a272361\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.764126 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content\") pod \"3c5e235e-cba9-4c1c-a365-4b089a272361\" (UID: \"3c5e235e-cba9-4c1c-a365-4b089a272361\") " Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.764841 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities" (OuterVolumeSpecName: "utilities") pod "3c5e235e-cba9-4c1c-a365-4b089a272361" (UID: "3c5e235e-cba9-4c1c-a365-4b089a272361"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.764853 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.764887 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bf9x\" (UniqueName: \"kubernetes.io/projected/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-kube-api-access-5bf9x\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.767171 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k" (OuterVolumeSpecName: "kube-api-access-2lb6k") pod "3c5e235e-cba9-4c1c-a365-4b089a272361" (UID: "3c5e235e-cba9-4c1c-a365-4b089a272361"). InnerVolumeSpecName "kube-api-access-2lb6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.771803 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16eff8c1-0b0a-4438-89fc-1c4ac110b24f" (UID: "16eff8c1-0b0a-4438-89fc-1c4ac110b24f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.809279 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c5e235e-cba9-4c1c-a365-4b089a272361" (UID: "3c5e235e-cba9-4c1c-a365-4b089a272361"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.867596 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.867636 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lb6k\" (UniqueName: \"kubernetes.io/projected/3c5e235e-cba9-4c1c-a365-4b089a272361-kube-api-access-2lb6k\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.867649 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16eff8c1-0b0a-4438-89fc-1c4ac110b24f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.867658 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e235e-cba9-4c1c-a365-4b089a272361-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.918044 4666 generic.go:334] "Generic (PLEG): container finished" podID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerID="a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece" exitCode=0 Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.918114 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerDied","Data":"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece"} Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.918158 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b654x" event={"ID":"3c5e235e-cba9-4c1c-a365-4b089a272361","Type":"ContainerDied","Data":"2bda50bec4bdefb03cebf8ce643fe0c382e1da1de528395522d8ef9e39d8e831"} Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.918156 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b654x" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.918180 4666 scope.go:117] "RemoveContainer" containerID="a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.922382 4666 generic.go:334] "Generic (PLEG): container finished" podID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerID="6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65" exitCode=0 Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.922406 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerDied","Data":"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65"} Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.922422 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxcnb" event={"ID":"16eff8c1-0b0a-4438-89fc-1c4ac110b24f","Type":"ContainerDied","Data":"bf75889af0278a6fbc32f03153232c4e6a223605ae30cac92b03b4dce98fc1ef"} Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.922475 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxcnb" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.942299 4666 scope.go:117] "RemoveContainer" containerID="12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.976170 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.984181 4666 scope.go:117] "RemoveContainer" containerID="23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8" Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.990320 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lxcnb"] Dec 03 13:14:37 crc kubenswrapper[4666]: I1203 13:14:37.999953 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.005556 4666 scope.go:117] "RemoveContainer" containerID="a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece" Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.007120 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece\": container with ID starting with a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece not found: ID does not exist" containerID="a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.007186 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece"} err="failed to get container status \"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece\": rpc error: code = NotFound desc = could not find container \"a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece\": container with ID starting with a293d276e673c59a5073711a7a1c031e8fc2a24d3d7432011ccf80580fdb0ece not found: ID does not exist" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.007225 4666 scope.go:117] "RemoveContainer" 
containerID="12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae" Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.008884 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae\": container with ID starting with 12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae not found: ID does not exist" containerID="12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.008921 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae"} err="failed to get container status \"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae\": rpc error: code = NotFound desc = could not find container \"12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae\": container with ID starting with 12f6379b9751eba21f9d8a087f4fce79076f52f3e9f199c5f22404521bee0bae not found: ID does not exist" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.008942 4666 scope.go:117] "RemoveContainer" containerID="23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.009392 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b654x"] Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.010228 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8\": container with ID starting with 23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8 not found: ID does not exist" containerID="23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.010263 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8"} err="failed to get container status \"23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8\": rpc error: code = NotFound desc = could not find container \"23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8\": container with ID starting with 23a9bd7e0842648abbfd27a5210f52d1ffa691a0d925af46889516780b41d5f8 not found: ID does not exist" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.010282 4666 scope.go:117] "RemoveContainer" containerID="6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.073375 4666 scope.go:117] "RemoveContainer" containerID="bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.092863 4666 scope.go:117] "RemoveContainer" containerID="8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.137641 4666 scope.go:117] "RemoveContainer" containerID="6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65" Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.138055 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65\": container with ID starting with 
6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65 not found: ID does not exist" containerID="6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.138088 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65"} err="failed to get container status \"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65\": rpc error: code = NotFound desc = could not find container \"6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65\": container with ID starting with 6407394d74698214cc60c899a37bee0c360197e7a3e02e0325fcd988e112be65 not found: ID does not exist" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.138122 4666 scope.go:117] "RemoveContainer" containerID="bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37" Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.138375 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37\": container with ID starting with bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37 not found: ID does not exist" containerID="bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.138401 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37"} err="failed to get container status \"bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37\": rpc error: code = NotFound desc = could not find container \"bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37\": container with ID starting with bde618af6b39daec707d00ed92e2c825f91ac1f26e515c2c175b980a232b4a37 not found: ID does not exist" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.138416 4666 scope.go:117] "RemoveContainer" containerID="8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34" Dec 03 13:14:38 crc kubenswrapper[4666]: E1203 13:14:38.138693 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34\": container with ID starting with 8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34 not found: ID does not exist" containerID="8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34" Dec 03 13:14:38 crc kubenswrapper[4666]: I1203 13:14:38.138713 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34"} err="failed to get container status \"8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34\": rpc error: code = NotFound desc = could not find container \"8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34\": container with ID starting with 8a2efee29a0830eecc9b29b50925e38bf2ccb079d4914dfa56f284f9f457ca34 not found: ID does not exist" Dec 03 13:14:39 crc kubenswrapper[4666]: I1203 13:14:39.434344 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" path="/var/lib/kubelet/pods/16eff8c1-0b0a-4438-89fc-1c4ac110b24f/volumes" Dec 03 13:14:39 crc kubenswrapper[4666]: I1203 13:14:39.435363 
4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" path="/var/lib/kubelet/pods/3c5e235e-cba9-4c1c-a365-4b089a272361/volumes" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.157477 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5"] Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158463 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="extract-content" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158494 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="extract-content" Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158504 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158509 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158522 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="extract-content" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158528 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="extract-content" Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158539 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158545 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158589 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="extract-utilities" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158596 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="extract-utilities" Dec 03 13:15:00 crc kubenswrapper[4666]: E1203 13:15:00.158602 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="extract-utilities" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158608 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="extract-utilities" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158828 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="16eff8c1-0b0a-4438-89fc-1c4ac110b24f" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.158851 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5e235e-cba9-4c1c-a365-4b089a272361" containerName="registry-server" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.159803 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.162955 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.163031 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.181993 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5"] Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.211401 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.211482 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ppkw\" (UniqueName: \"kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.211510 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.313237 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.313310 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ppkw\" (UniqueName: \"kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.313345 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.314420 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume\") pod 
\"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.319986 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.329818 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ppkw\" (UniqueName: \"kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw\") pod \"collect-profiles-29412795-b6dh5\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.480518 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:00 crc kubenswrapper[4666]: I1203 13:15:00.900099 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5"] Dec 03 13:15:01 crc kubenswrapper[4666]: I1203 13:15:01.140251 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" event={"ID":"88615659-1f7a-4d7e-ba15-d3f89534b454","Type":"ContainerStarted","Data":"0e2526aa90c73db388436fc1750c689248ccad689ea2d8c23cd1aada98be820b"} Dec 03 13:15:01 crc kubenswrapper[4666]: I1203 13:15:01.140611 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" event={"ID":"88615659-1f7a-4d7e-ba15-d3f89534b454","Type":"ContainerStarted","Data":"81db8c54c03cf2ca81e85a691a21cef155b20ada49efbac980acfeac0c00aeb7"} Dec 03 13:15:01 crc kubenswrapper[4666]: I1203 13:15:01.162112 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" podStartSLOduration=1.162070695 podStartE2EDuration="1.162070695s" podCreationTimestamp="2025-12-03 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:15:01.155919979 +0000 UTC m=+3690.000881050" watchObservedRunningTime="2025-12-03 13:15:01.162070695 +0000 UTC m=+3690.007031746" Dec 03 13:15:02 crc kubenswrapper[4666]: I1203 13:15:02.152171 4666 generic.go:334] "Generic (PLEG): container finished" podID="88615659-1f7a-4d7e-ba15-d3f89534b454" containerID="0e2526aa90c73db388436fc1750c689248ccad689ea2d8c23cd1aada98be820b" exitCode=0 Dec 03 13:15:02 crc kubenswrapper[4666]: I1203 13:15:02.152220 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" event={"ID":"88615659-1f7a-4d7e-ba15-d3f89534b454","Type":"ContainerDied","Data":"0e2526aa90c73db388436fc1750c689248ccad689ea2d8c23cd1aada98be820b"} Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.524482 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.565658 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume\") pod \"88615659-1f7a-4d7e-ba15-d3f89534b454\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.565720 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume\") pod \"88615659-1f7a-4d7e-ba15-d3f89534b454\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.565783 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ppkw\" (UniqueName: \"kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw\") pod \"88615659-1f7a-4d7e-ba15-d3f89534b454\" (UID: \"88615659-1f7a-4d7e-ba15-d3f89534b454\") " Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.566723 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume" (OuterVolumeSpecName: "config-volume") pod "88615659-1f7a-4d7e-ba15-d3f89534b454" (UID: "88615659-1f7a-4d7e-ba15-d3f89534b454"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.571866 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88615659-1f7a-4d7e-ba15-d3f89534b454" (UID: "88615659-1f7a-4d7e-ba15-d3f89534b454"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.572357 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw" (OuterVolumeSpecName: "kube-api-access-8ppkw") pod "88615659-1f7a-4d7e-ba15-d3f89534b454" (UID: "88615659-1f7a-4d7e-ba15-d3f89534b454"). InnerVolumeSpecName "kube-api-access-8ppkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.667376 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ppkw\" (UniqueName: \"kubernetes.io/projected/88615659-1f7a-4d7e-ba15-d3f89534b454-kube-api-access-8ppkw\") on node \"crc\" DevicePath \"\"" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.667426 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88615659-1f7a-4d7e-ba15-d3f89534b454-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:15:03 crc kubenswrapper[4666]: I1203 13:15:03.667437 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88615659-1f7a-4d7e-ba15-d3f89534b454-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 13:15:04 crc kubenswrapper[4666]: I1203 13:15:04.168408 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" event={"ID":"88615659-1f7a-4d7e-ba15-d3f89534b454","Type":"ContainerDied","Data":"81db8c54c03cf2ca81e85a691a21cef155b20ada49efbac980acfeac0c00aeb7"} Dec 03 13:15:04 crc kubenswrapper[4666]: I1203 13:15:04.168442 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5" Dec 03 13:15:04 crc kubenswrapper[4666]: I1203 13:15:04.168454 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81db8c54c03cf2ca81e85a691a21cef155b20ada49efbac980acfeac0c00aeb7" Dec 03 13:15:04 crc kubenswrapper[4666]: I1203 13:15:04.235216 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt"] Dec 03 13:15:04 crc kubenswrapper[4666]: I1203 13:15:04.242570 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-ds6lt"] Dec 03 13:15:05 crc kubenswrapper[4666]: I1203 13:15:05.435450 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274f78bd-4167-45f6-9b52-bbb47d6ce388" path="/var/lib/kubelet/pods/274f78bd-4167-45f6-9b52-bbb47d6ce388/volumes" Dec 03 13:15:50 crc kubenswrapper[4666]: I1203 13:15:50.591561 4666 scope.go:117] "RemoveContainer" containerID="f023f849ad36ddfea7f6a9cb8aef15f795eb86afe4becf441c67a153102fbd1a" Dec 03 13:16:09 crc kubenswrapper[4666]: I1203 13:16:09.709541 4666 generic.go:334] "Generic (PLEG): container finished" podID="a165368e-be15-48d7-afad-92850b6844ea" containerID="12c6015f7b5734fbcb28b83a89722b19c170ff843d0841abad4a108d0076705e" exitCode=0 Dec 03 13:16:09 crc kubenswrapper[4666]: I1203 13:16:09.709605 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" event={"ID":"a165368e-be15-48d7-afad-92850b6844ea","Type":"ContainerDied","Data":"12c6015f7b5734fbcb28b83a89722b19c170ff843d0841abad4a108d0076705e"} Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.238962 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.306236 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.306394 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.308450 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl6dp\" (UniqueName: \"kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.308615 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.308658 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.308750 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory\") pod \"a165368e-be15-48d7-afad-92850b6844ea\" (UID: \"a165368e-be15-48d7-afad-92850b6844ea\") " Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.314325 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.314971 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph" (OuterVolumeSpecName: "ceph") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.318978 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp" (OuterVolumeSpecName: "kube-api-access-fl6dp") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "kube-api-access-fl6dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.338244 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.338300 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.338647 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory" (OuterVolumeSpecName: "inventory") pod "a165368e-be15-48d7-afad-92850b6844ea" (UID: "a165368e-be15-48d7-afad-92850b6844ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413650 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl6dp\" (UniqueName: \"kubernetes.io/projected/a165368e-be15-48d7-afad-92850b6844ea-kube-api-access-fl6dp\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413708 4666 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413720 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413732 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413743 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.413754 4666 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a165368e-be15-48d7-afad-92850b6844ea-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.730849 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" event={"ID":"a165368e-be15-48d7-afad-92850b6844ea","Type":"ContainerDied","Data":"caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2"} Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.730897 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf0772041340856cd73df36aa6916713a88e64b6568de292e1073776f02aad2" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.730906 4666 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.825105 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7"] Dec 03 13:16:11 crc kubenswrapper[4666]: E1203 13:16:11.825499 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a165368e-be15-48d7-afad-92850b6844ea" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.825519 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="a165368e-be15-48d7-afad-92850b6844ea" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 13:16:11 crc kubenswrapper[4666]: E1203 13:16:11.825538 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88615659-1f7a-4d7e-ba15-d3f89534b454" containerName="collect-profiles" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.825544 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="88615659-1f7a-4d7e-ba15-d3f89534b454" containerName="collect-profiles" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.825714 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="a165368e-be15-48d7-afad-92850b6844ea" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.825734 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="88615659-1f7a-4d7e-ba15-d3f89534b454" containerName="collect-profiles" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.826860 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.828720 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.832191 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.832639 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2mmd" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.832816 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.832994 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.833224 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.833536 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.833784 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.833970 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.838365 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7"] Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922360 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922420 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922443 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922482 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922508 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922654 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922857 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.922936 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.923069 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.923121 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:11 crc kubenswrapper[4666]: I1203 13:16:11.923197 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtnfh\" (UniqueName: \"kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.024831 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025248 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025412 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025522 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025674 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025762 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025850 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtnfh\" (UniqueName: \"kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025947 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025920 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.025963 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.026191 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.026279 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.026376 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.029659 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.029746 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.029827 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.030331 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.030361 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.031223 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.034057 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.036717 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.044560 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtnfh\" (UniqueName: \"kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.177546 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.662826 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7"] Dec 03 13:16:12 crc kubenswrapper[4666]: W1203 13:16:12.668911 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfec4a43_8c2c_4dab_b3c1_2bc56e71d330.slice/crio-1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563 WatchSource:0}: Error finding container 1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563: Status 404 returned error can't find the container with id 1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563 Dec 03 13:16:12 crc kubenswrapper[4666]: I1203 13:16:12.740477 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" event={"ID":"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330","Type":"ContainerStarted","Data":"1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563"} Dec 03 13:16:14 crc kubenswrapper[4666]: I1203 13:16:14.759469 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" event={"ID":"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330","Type":"ContainerStarted","Data":"1b56899461d28dd91a54d33841e098a880c755b74daccf4d922f569554d3fb04"} Dec 03 13:16:14 crc kubenswrapper[4666]: I1203 13:16:14.783229 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" podStartSLOduration=3.311820084 podStartE2EDuration="3.783071147s" podCreationTimestamp="2025-12-03 13:16:11 +0000 UTC" firstStartedPulling="2025-12-03 13:16:12.671294158 +0000 UTC m=+3761.516255209" lastFinishedPulling="2025-12-03 13:16:13.142545231 +0000 UTC m=+3761.987506272" observedRunningTime="2025-12-03 13:16:14.778946756 +0000 UTC m=+3763.623907807" watchObservedRunningTime="2025-12-03 13:16:14.783071147 +0000 UTC m=+3763.628032198" Dec 03 13:16:39 crc kubenswrapper[4666]: 
I1203 13:16:39.866476 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:16:39 crc kubenswrapper[4666]: I1203 13:16:39.867025 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:17:09 crc kubenswrapper[4666]: I1203 13:17:09.866612 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:17:09 crc kubenswrapper[4666]: I1203 13:17:09.867159 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:17:39 crc kubenswrapper[4666]: I1203 13:17:39.865708 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:17:39 crc kubenswrapper[4666]: I1203 13:17:39.866335 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:17:39 crc kubenswrapper[4666]: I1203 13:17:39.866391 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:17:39 crc kubenswrapper[4666]: I1203 13:17:39.867259 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:17:39 crc kubenswrapper[4666]: I1203 13:17:39.867327 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" gracePeriod=600 Dec 03 13:17:39 crc kubenswrapper[4666]: E1203 13:17:39.999632 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:17:40 crc kubenswrapper[4666]: I1203 13:17:40.499212 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" exitCode=0 Dec 03 13:17:40 crc kubenswrapper[4666]: I1203 13:17:40.499260 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53"} Dec 03 13:17:40 crc kubenswrapper[4666]: I1203 13:17:40.499301 4666 scope.go:117] "RemoveContainer" containerID="98c09596cb5203ca373afe5f4a85f528da952e7cf8e58feb816920acbd8f580b" Dec 03 13:17:40 crc kubenswrapper[4666]: I1203 13:17:40.500316 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:17:40 crc kubenswrapper[4666]: E1203 13:17:40.500731 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:17:55 crc kubenswrapper[4666]: I1203 13:17:55.425320 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:17:55 crc kubenswrapper[4666]: E1203 13:17:55.426669 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:18:07 crc kubenswrapper[4666]: I1203 13:18:07.423591 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:18:07 crc kubenswrapper[4666]: E1203 13:18:07.424441 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:18:18 crc kubenswrapper[4666]: I1203 13:18:18.423582 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:18:18 crc kubenswrapper[4666]: E1203 13:18:18.424358 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Dec 03 13:18:32 crc kubenswrapper[4666]: I1203 13:18:32.424330 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53"
Dec 03 13:18:32 crc kubenswrapper[4666]: E1203 13:18:32.425622 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:18:42 crc kubenswrapper[4666]: I1203 13:18:42.918860 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"]
Dec 03 13:18:42 crc kubenswrapper[4666]: I1203 13:18:42.921642 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:42 crc kubenswrapper[4666]: I1203 13:18:42.933870 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"]
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.060146 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7cq\" (UniqueName: \"kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.060416 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.060513 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.161787 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.161852 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.161927 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7cq\" (UniqueName: \"kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.162498 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.162526 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.179949 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz7cq\" (UniqueName: \"kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq\") pod \"redhat-marketplace-pg4f9\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.246332 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:43 crc kubenswrapper[4666]: I1203 13:18:43.703703 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"]
Dec 03 13:18:43 crc kubenswrapper[4666]: W1203 13:18:43.706498 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa856a8f_e2b1_46ae_bc18_6e217fd2e4d6.slice/crio-e84f5ea26b7920ae5bc268057d73ec8e33f770a82ca3b3c1377b3cb91593bb5b WatchSource:0}: Error finding container e84f5ea26b7920ae5bc268057d73ec8e33f770a82ca3b3c1377b3cb91593bb5b: Status 404 returned error can't find the container with id e84f5ea26b7920ae5bc268057d73ec8e33f770a82ca3b3c1377b3cb91593bb5b
Dec 03 13:18:44 crc kubenswrapper[4666]: I1203 13:18:44.053078 4666 generic.go:334] "Generic (PLEG): container finished" podID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerID="ad4a3118ae58154fbf305cc96719ca430081844719adc4f5384efe6e97475081" exitCode=0
Dec 03 13:18:44 crc kubenswrapper[4666]: I1203 13:18:44.053155 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerDied","Data":"ad4a3118ae58154fbf305cc96719ca430081844719adc4f5384efe6e97475081"}
Dec 03 13:18:44 crc kubenswrapper[4666]: I1203 13:18:44.054682 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerStarted","Data":"e84f5ea26b7920ae5bc268057d73ec8e33f770a82ca3b3c1377b3cb91593bb5b"}
Dec 03 13:18:44 crc kubenswrapper[4666]: I1203 13:18:44.055130 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 13:18:45 crc kubenswrapper[4666]: I1203 13:18:45.064188 4666 generic.go:334] "Generic (PLEG): container finished" podID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerID="880c2b70f1db7dbfa624cc1361ccb3e99d3e43a1d63121fe2272f8fae6d2d5f0" exitCode=0
Dec 03 13:18:45 crc kubenswrapper[4666]: I1203 13:18:45.064312 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerDied","Data":"880c2b70f1db7dbfa624cc1361ccb3e99d3e43a1d63121fe2272f8fae6d2d5f0"}
Dec 03 13:18:45 crc kubenswrapper[4666]: I1203 13:18:45.424648 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53"
Dec 03 13:18:45 crc kubenswrapper[4666]: E1203 13:18:45.424942 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:18:46 crc kubenswrapper[4666]: I1203 13:18:46.086778 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerStarted","Data":"d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947"}
Dec 03 13:18:46 crc kubenswrapper[4666]: I1203 13:18:46.104727 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pg4f9" podStartSLOduration=2.671058308 podStartE2EDuration="4.104708061s" podCreationTimestamp="2025-12-03 13:18:42 +0000 UTC" firstStartedPulling="2025-12-03 13:18:44.054899821 +0000 UTC m=+3912.899860862" lastFinishedPulling="2025-12-03 13:18:45.488549564 +0000 UTC m=+3914.333510615" observedRunningTime="2025-12-03 13:18:46.102252605 +0000 UTC m=+3914.947213676" watchObservedRunningTime="2025-12-03 13:18:46.104708061 +0000 UTC m=+3914.949669112"
Dec 03 13:18:53 crc kubenswrapper[4666]: I1203 13:18:53.247201 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:53 crc kubenswrapper[4666]: I1203 13:18:53.248544 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:53 crc kubenswrapper[4666]: I1203 13:18:53.303718 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:54 crc kubenswrapper[4666]: I1203 13:18:54.194496 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pg4f9"
Dec 03 13:18:54 crc kubenswrapper[4666]: I1203 13:18:54.240641 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"]
Dec 03 13:18:56 crc kubenswrapper[4666]: I1203 13:18:56.167527 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pg4f9" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="registry-server" containerID="cri-o://d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947" gracePeriod=2
Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.178904 4666 generic.go:334] "Generic (PLEG): container finished" podID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerID="d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947" exitCode=0
podID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerID="d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947" exitCode=0 Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.178991 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerDied","Data":"d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947"} Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.344821 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pg4f9" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.541059 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content\") pod \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.541128 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities\") pod \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.541293 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz7cq\" (UniqueName: \"kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq\") pod \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\" (UID: \"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6\") " Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.542387 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities" (OuterVolumeSpecName: "utilities") pod "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" (UID: "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.543690 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.547219 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq" (OuterVolumeSpecName: "kube-api-access-xz7cq") pod "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" (UID: "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6"). InnerVolumeSpecName "kube-api-access-xz7cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.563659 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" (UID: "fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.645149 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz7cq\" (UniqueName: \"kubernetes.io/projected/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-kube-api-access-xz7cq\") on node \"crc\" DevicePath \"\"" Dec 03 13:18:57 crc kubenswrapper[4666]: I1203 13:18:57.645666 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.190898 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pg4f9" event={"ID":"fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6","Type":"ContainerDied","Data":"e84f5ea26b7920ae5bc268057d73ec8e33f770a82ca3b3c1377b3cb91593bb5b"} Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.190951 4666 scope.go:117] "RemoveContainer" containerID="d5b494d7475662d0be919e4f336c1f1ca862661479b96f528d3d2e599319f947" Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.191727 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pg4f9" Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.212378 4666 scope.go:117] "RemoveContainer" containerID="880c2b70f1db7dbfa624cc1361ccb3e99d3e43a1d63121fe2272f8fae6d2d5f0" Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.228119 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"] Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.242647 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pg4f9"] Dec 03 13:18:58 crc kubenswrapper[4666]: I1203 13:18:58.251984 4666 scope.go:117] "RemoveContainer" containerID="ad4a3118ae58154fbf305cc96719ca430081844719adc4f5384efe6e97475081" Dec 03 13:18:59 crc kubenswrapper[4666]: I1203 13:18:59.432239 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" path="/var/lib/kubelet/pods/fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6/volumes" Dec 03 13:19:00 crc kubenswrapper[4666]: I1203 13:19:00.423887 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:19:00 crc kubenswrapper[4666]: E1203 13:19:00.424106 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:19:11 crc kubenswrapper[4666]: I1203 13:19:11.431065 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:19:11 crc kubenswrapper[4666]: E1203 13:19:11.432106 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" 
podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:19:15 crc kubenswrapper[4666]: I1203 13:19:15.333945 4666 generic.go:334] "Generic (PLEG): container finished" podID="dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" containerID="1b56899461d28dd91a54d33841e098a880c755b74daccf4d922f569554d3fb04" exitCode=0 Dec 03 13:19:15 crc kubenswrapper[4666]: I1203 13:19:15.334121 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" event={"ID":"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330","Type":"ContainerDied","Data":"1b56899461d28dd91a54d33841e098a880c755b74daccf4d922f569554d3fb04"} Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.718036 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802396 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802446 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802470 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802493 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802514 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802546 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802579 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802594 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802618 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802664 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtnfh\" (UniqueName: \"kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.802726 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0\") pod \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\" (UID: \"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330\") " Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.808773 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph" (OuterVolumeSpecName: "ceph") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.812050 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.813023 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh" (OuterVolumeSpecName: "kube-api-access-gtnfh") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "kube-api-access-gtnfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.828322 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.828752 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.830812 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.835456 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory" (OuterVolumeSpecName: "inventory") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.835904 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.836414 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.837895 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.847887 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" (UID: "dfec4a43-8c2c-4dab-b3c1-2bc56e71d330"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905879 4666 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905921 4666 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905935 4666 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905948 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905959 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905970 4666 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905981 4666 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.905993 4666 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.906006 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtnfh\" (UniqueName: \"kubernetes.io/projected/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-kube-api-access-gtnfh\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.906019 4666 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:16 crc kubenswrapper[4666]: I1203 13:19:16.906030 4666 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfec4a43-8c2c-4dab-b3c1-2bc56e71d330-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:17 crc kubenswrapper[4666]: I1203 13:19:17.354474 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" event={"ID":"dfec4a43-8c2c-4dab-b3c1-2bc56e71d330","Type":"ContainerDied","Data":"1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563"} Dec 03 13:19:17 crc kubenswrapper[4666]: I1203 13:19:17.354549 4666 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="1110405ae12e8a3bcf0cc10186c27889622d0c8d7efdd03c1708357f6ad7f563" Dec 03 13:19:17 crc kubenswrapper[4666]: I1203 13:19:17.354611 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7" Dec 03 13:19:26 crc kubenswrapper[4666]: I1203 13:19:26.423575 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:19:26 crc kubenswrapper[4666]: E1203 13:19:26.424541 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.142158 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 03 13:19:32 crc kubenswrapper[4666]: E1203 13:19:32.146242 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.146288 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 03 13:19:32 crc kubenswrapper[4666]: E1203 13:19:32.146375 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="extract-content" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.146385 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="extract-content" Dec 03 13:19:32 crc kubenswrapper[4666]: E1203 13:19:32.146415 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="extract-utilities" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.146424 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="extract-utilities" Dec 03 13:19:32 crc kubenswrapper[4666]: E1203 13:19:32.146460 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="registry-server" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.146468 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="registry-server" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.162633 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfec4a43-8c2c-4dab-b3c1-2bc56e71d330" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.162680 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa856a8f-e2b1-46ae-bc18-6e217fd2e4d6" containerName="registry-server" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.164365 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.166729 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.167384 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.170589 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.170691 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.172884 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.192310 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.215820 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238293 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238338 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-run\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238354 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238371 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238385 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-dev\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238413 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238466 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238486 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238540 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238570 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-lib-modules\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238589 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238622 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85f2l\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-kube-api-access-85f2l\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238645 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238663 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-ceph\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238680 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238700 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238727 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238747 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9lx\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-kube-api-access-qp9lx\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238768 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-run\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238781 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238796 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238833 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238847 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238869 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-scripts\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238890 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238917 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238940 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238955 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.238980 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-sys\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.239001 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.239043 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.239062 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340284 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-sys\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340330 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340374 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340396 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340426 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340450 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-run\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340464 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340483 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340477 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-sys\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340534 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-dev\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340499 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-dev\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340578 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340600 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340618 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340641 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340659 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-lib-modules\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340680 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340697 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85f2l\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-kube-api-access-85f2l\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340714 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340729 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-ceph\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340743 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340762 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340787 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340806 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9lx\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-kube-api-access-qp9lx\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340826 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-run\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340841 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340857 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340885 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340899 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340922 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-scripts\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340943 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0"
Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340970 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0"
\"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.340985 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341004 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341309 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341394 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-run\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341424 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341635 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341689 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341865 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.341962 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342000 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-run\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342027 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342062 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342113 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342141 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342261 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342380 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-lib-modules\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342423 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342523 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342554 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.342610 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37430c8d-6678-44c5-a349-8cb94fbb9108-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.350693 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-scripts\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.351048 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.352500 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.352626 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.352718 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.352869 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.355459 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.355499 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37430c8d-6678-44c5-a349-8cb94fbb9108-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.359402 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-ceph\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.362824 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-85f2l\" (UniqueName: \"kubernetes.io/projected/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-kube-api-access-85f2l\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.364823 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9lx\" (UniqueName: \"kubernetes.io/projected/37430c8d-6678-44c5-a349-8cb94fbb9108-kube-api-access-qp9lx\") pod \"cinder-backup-0\" (UID: \"37430c8d-6678-44c5-a349-8cb94fbb9108\") " pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.365253 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978def64-fbe5-4ce2-a2ab-f12bd95ef64a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"978def64-fbe5-4ce2-a2ab-f12bd95ef64a\") " pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.512709 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.523839 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.628901 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-rzxg8"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.630298 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.643684 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-rzxg8"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.716231 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b7f3-account-create-update-f6m25"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.717501 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.720518 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.741289 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b7f3-account-create-update-f6m25"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.756181 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxt6\" (UniqueName: \"kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6\") pod \"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.756263 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.756298 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2d2\" (UniqueName: \"kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.756375 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts\") pod \"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.808783 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.810327 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.812312 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.812943 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.813183 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.815406 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-vbnqg" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.846144 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.860932 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.860986 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2d2\" (UniqueName: \"kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.861062 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts\") pod \"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.861121 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxt6\" (UniqueName: \"kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6\") pod \"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.862220 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.863131 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts\") pod \"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.902758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxt6\" (UniqueName: \"kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6\") pod 
\"manila-db-create-rzxg8\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.910989 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2d2\" (UniqueName: \"kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2\") pod \"manila-b7f3-account-create-update-f6m25\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.915136 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.916637 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.920117 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.920785 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b6lz5" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.921397 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.922140 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974137 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974498 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974635 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974701 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9w9s\" (UniqueName: \"kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974770 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:32 crc kubenswrapper[4666]: I1203 13:19:32.974838 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.003908 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.005436 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.009247 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.044170 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.057179 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.066455 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.069669 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076718 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076760 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076784 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076828 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9w9s\" (UniqueName: \"kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076859 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076889 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076907 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076942 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.076981 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077006 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077024 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077040 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077059 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtpj\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077107 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.077556 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.079516 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.081122 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.086317 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.086470 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.092162 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.106568 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.110553 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9w9s\" (UniqueName: \"kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s\") pod \"horizon-5db5796857-lghnw\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.153689 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181192 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181242 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181290 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181308 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181337 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181357 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181379 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181405 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181420 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181437 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181463 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181486 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181501 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvx52\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181527 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181558 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181583 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181646 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181667 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtpj\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj\") pod \"glance-default-external-api-0\" (UID: 
\"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181704 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181728 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kghx\" (UniqueName: \"kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181746 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181774 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.181800 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.182174 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.185710 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.187436 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.193882 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 
crc kubenswrapper[4666]: I1203 13:19:33.198433 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.200351 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.200452 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.214414 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtpj\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.234160 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283632 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283684 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283731 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283751 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283790 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283823 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283841 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283858 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283896 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283919 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283938 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvx52\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.283972 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.284054 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kghx\" (UniqueName: \"kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.284069 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.284448 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.287725 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.288275 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.288668 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.289075 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.289801 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.293742 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.302084 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.303368 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.305036 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.313009 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvx52\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.317271 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.324521 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kghx\" (UniqueName: \"kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx\") pod \"horizon-5f477cf945-89g4r\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.327284 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.329733 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.338699 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.339565 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.408805 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.492580 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.515619 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37430c8d-6678-44c5-a349-8cb94fbb9108","Type":"ContainerStarted","Data":"f85b9c61959fa696f89bc861e86c88aa2d2cfe5915b7e81601fa15862958f274"} Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.553763 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.590400 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.781163 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-rzxg8"] Dec 03 13:19:33 crc kubenswrapper[4666]: W1203 13:19:33.791163 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a40df46_c8e9_496a_af5c_3066bafd781b.slice/crio-03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8 WatchSource:0}: Error finding container 03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8: Status 404 returned error can't find the container with id 03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8 Dec 03 13:19:33 crc kubenswrapper[4666]: I1203 13:19:33.799056 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b7f3-account-create-update-f6m25"] Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.036636 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:19:34 crc kubenswrapper[4666]: W1203 13:19:34.042135 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5d2c81_de3b_409e_9dae_63770077a8b4.slice/crio-94adce438f7d6ea77839d3a3566149108369fd74e9565770a5e55dfd43c51cce WatchSource:0}: Error finding container 94adce438f7d6ea77839d3a3566149108369fd74e9565770a5e55dfd43c51cce: Status 404 returned error can't find the container with id 94adce438f7d6ea77839d3a3566149108369fd74e9565770a5e55dfd43c51cce Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.178562 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:19:34 crc kubenswrapper[4666]: W1203 13:19:34.178704 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ebdb93b_7a58_46b5_82d6_28c35c6ff575.slice/crio-6c163b3b0b702deaa1bc68983e0857b1b26183c49b3c113c45cf8078fdcc1f49 WatchSource:0}: Error finding container 6c163b3b0b702deaa1bc68983e0857b1b26183c49b3c113c45cf8078fdcc1f49: Status 404 returned error can't find the container with id 6c163b3b0b702deaa1bc68983e0857b1b26183c49b3c113c45cf8078fdcc1f49 Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.211369 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:34 crc kubenswrapper[4666]: W1203 13:19:34.288920 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2573e227_e982_45eb_a3ee_27fe6835b056.slice/crio-95c5569bf65d0ead959167a6d4b133aea69eed651d6a73699af42d88218e02c7 WatchSource:0}: Error finding container 95c5569bf65d0ead959167a6d4b133aea69eed651d6a73699af42d88218e02c7: Status 404 returned error can't find the container with id 95c5569bf65d0ead959167a6d4b133aea69eed651d6a73699af42d88218e02c7 Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.289634 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:34 crc kubenswrapper[4666]: W1203 13:19:34.357472 4666 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b48dba0_97b3_43a0_bbde_a6e73a064d40.slice/crio-f5c7f51a834319cc6c04f7da31105fa3ae096554e616c3f172b05d5c4dce06db WatchSource:0}: Error finding container f5c7f51a834319cc6c04f7da31105fa3ae096554e616c3f172b05d5c4dce06db: Status 404 returned error can't find the container with id f5c7f51a834319cc6c04f7da31105fa3ae096554e616c3f172b05d5c4dce06db Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.548427 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerStarted","Data":"f5c7f51a834319cc6c04f7da31105fa3ae096554e616c3f172b05d5c4dce06db"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.551251 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerStarted","Data":"94adce438f7d6ea77839d3a3566149108369fd74e9565770a5e55dfd43c51cce"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.558172 4666 generic.go:334] "Generic (PLEG): container finished" podID="38105880-8017-49b9-a922-d936adc03946" containerID="06854ba650ff3d7b14f2a8d2bc15ec70324ea120d741bbe4c27aa14d3b77b3ae" exitCode=0 Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.558737 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rzxg8" event={"ID":"38105880-8017-49b9-a922-d936adc03946","Type":"ContainerDied","Data":"06854ba650ff3d7b14f2a8d2bc15ec70324ea120d741bbe4c27aa14d3b77b3ae"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.558773 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rzxg8" event={"ID":"38105880-8017-49b9-a922-d936adc03946","Type":"ContainerStarted","Data":"0c263e1cd229054fb5883b3dc00de1aae27e4645b173424c031baab1ddebc6ec"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.564694 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerStarted","Data":"95c5569bf65d0ead959167a6d4b133aea69eed651d6a73699af42d88218e02c7"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.566155 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"978def64-fbe5-4ce2-a2ab-f12bd95ef64a","Type":"ContainerStarted","Data":"5b6cf01b79885e93ced8bca8969d0167a5702ad4aa586a1f2abb34fb2f80ae8c"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.569264 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerStarted","Data":"6c163b3b0b702deaa1bc68983e0857b1b26183c49b3c113c45cf8078fdcc1f49"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.575297 4666 generic.go:334] "Generic (PLEG): container finished" podID="6a40df46-c8e9-496a-af5c-3066bafd781b" containerID="459b69c0e6ac76c45e4b716dac7238ac9ba1da345d348803f5987c0f4a7455ca" exitCode=0 Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.575334 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b7f3-account-create-update-f6m25" event={"ID":"6a40df46-c8e9-496a-af5c-3066bafd781b","Type":"ContainerDied","Data":"459b69c0e6ac76c45e4b716dac7238ac9ba1da345d348803f5987c0f4a7455ca"} Dec 03 13:19:34 crc kubenswrapper[4666]: I1203 13:19:34.575359 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-b7f3-account-create-update-f6m25" event={"ID":"6a40df46-c8e9-496a-af5c-3066bafd781b","Type":"ContainerStarted","Data":"03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8"} Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.366855 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.391472 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.393015 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.404258 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.424066 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.480424 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.508204 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516074 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516144 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516164 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516206 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516237 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlmv\" (UniqueName: \"kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516258 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.516292 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.525150 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bd58698c4-v4zw4"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.526835 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.550235 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.572742 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bd58698c4-v4zw4"] Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618413 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618464 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf56\" (UniqueName: \"kubernetes.io/projected/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-kube-api-access-xbf56\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618485 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-scripts\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618510 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618528 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-config-data\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618545 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " 
pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618585 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618606 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-tls-certs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618624 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-secret-key\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618639 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-logs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618656 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlmv\" (UniqueName: \"kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618678 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618704 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-combined-ca-bundle\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.618724 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.619589 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 
13:19:35.619832 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.620541 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.633459 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37430c8d-6678-44c5-a349-8cb94fbb9108","Type":"ContainerStarted","Data":"51e9208fef97c0807e63148c03d4bad04adb738675f5aea9c81b9e8d173805d5"} Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.633505 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37430c8d-6678-44c5-a349-8cb94fbb9108","Type":"ContainerStarted","Data":"bdc622389a6f2c3ed821cd63425a7b19994f74e225d1ff58fa43d86e85b9a4d1"} Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.657399 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"978def64-fbe5-4ce2-a2ab-f12bd95ef64a","Type":"ContainerStarted","Data":"b5f040334016673326510e5a9b2319c0cf9229ef4151a0f4a8819ec18274f2a6"} Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.659613 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.932997205 podStartE2EDuration="3.659596865s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="2025-12-03 13:19:33.504979354 +0000 UTC m=+3962.349940395" lastFinishedPulling="2025-12-03 13:19:34.231579004 +0000 UTC m=+3963.076540055" observedRunningTime="2025-12-03 13:19:35.65681669 +0000 UTC m=+3964.501777751" watchObservedRunningTime="2025-12-03 13:19:35.659596865 +0000 UTC m=+3964.504557916" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.695718 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.695934 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.703683 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmlmv\" (UniqueName: \"kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.703752 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs\") pod \"horizon-686f87d7cd-k4tjz\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721028 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbf56\" (UniqueName: \"kubernetes.io/projected/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-kube-api-access-xbf56\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721105 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-scripts\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721138 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-config-data\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721220 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-tls-certs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721249 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-secret-key\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721270 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-logs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721350 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-combined-ca-bundle\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.721665 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-logs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.723141 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-scripts\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " 
pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.723549 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-config-data\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.725743 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-secret-key\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.727844 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-combined-ca-bundle\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.741050 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-horizon-tls-certs\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.749533 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf56\" (UniqueName: \"kubernetes.io/projected/a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd-kube-api-access-xbf56\") pod \"horizon-6bd58698c4-v4zw4\" (UID: \"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd\") " pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.912848 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:35 crc kubenswrapper[4666]: I1203 13:19:35.937947 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.049814 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.139602 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts\") pod \"38105880-8017-49b9-a922-d936adc03946\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.139770 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgxt6\" (UniqueName: \"kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6\") pod \"38105880-8017-49b9-a922-d936adc03946\" (UID: \"38105880-8017-49b9-a922-d936adc03946\") " Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.140522 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38105880-8017-49b9-a922-d936adc03946" (UID: "38105880-8017-49b9-a922-d936adc03946"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.143431 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6" (OuterVolumeSpecName: "kube-api-access-cgxt6") pod "38105880-8017-49b9-a922-d936adc03946" (UID: "38105880-8017-49b9-a922-d936adc03946"). InnerVolumeSpecName "kube-api-access-cgxt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.224893 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.243951 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts\") pod \"6a40df46-c8e9-496a-af5c-3066bafd781b\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.244009 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb2d2\" (UniqueName: \"kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2\") pod \"6a40df46-c8e9-496a-af5c-3066bafd781b\" (UID: \"6a40df46-c8e9-496a-af5c-3066bafd781b\") " Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.244684 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgxt6\" (UniqueName: \"kubernetes.io/projected/38105880-8017-49b9-a922-d936adc03946-kube-api-access-cgxt6\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.244701 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38105880-8017-49b9-a922-d936adc03946-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.247843 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a40df46-c8e9-496a-af5c-3066bafd781b" (UID: "6a40df46-c8e9-496a-af5c-3066bafd781b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.264641 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2" (OuterVolumeSpecName: "kube-api-access-kb2d2") pod "6a40df46-c8e9-496a-af5c-3066bafd781b" (UID: "6a40df46-c8e9-496a-af5c-3066bafd781b"). InnerVolumeSpecName "kube-api-access-kb2d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.347200 4666 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a40df46-c8e9-496a-af5c-3066bafd781b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.347236 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb2d2\" (UniqueName: \"kubernetes.io/projected/6a40df46-c8e9-496a-af5c-3066bafd781b-kube-api-access-kb2d2\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.616508 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bd58698c4-v4zw4"] Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.663724 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.678397 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b7f3-account-create-update-f6m25" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.678574 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b7f3-account-create-update-f6m25" event={"ID":"6a40df46-c8e9-496a-af5c-3066bafd781b","Type":"ContainerDied","Data":"03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.678613 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03fc3f67d96b0df6e8a4ed84b8d3d3d2ce5709598eb8419cc032f6773b2b14e8" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.681657 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerStarted","Data":"83254faf4d715d8729b12bc4ecc5e7b2b9d1d9f11854dfca71daed91eef8f03f"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.704398 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bd58698c4-v4zw4" event={"ID":"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd","Type":"ContainerStarted","Data":"278897988c6007b255849433722732ce306916df6c09b6e34eb6ffb6526c478b"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.708794 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rzxg8" event={"ID":"38105880-8017-49b9-a922-d936adc03946","Type":"ContainerDied","Data":"0c263e1cd229054fb5883b3dc00de1aae27e4645b173424c031baab1ddebc6ec"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.708828 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c263e1cd229054fb5883b3dc00de1aae27e4645b173424c031baab1ddebc6ec" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.708883 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-rzxg8" Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.724135 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerStarted","Data":"508b344bb8adb290403da3ac99aa6a2cd94ac5722e5b907535271da9338fb75e"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.743390 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"978def64-fbe5-4ce2-a2ab-f12bd95ef64a","Type":"ContainerStarted","Data":"69f66ba62368ebe404d124811abbafdc8ef6365f606b65dbbbab5c9be90b14d8"} Dec 03 13:19:36 crc kubenswrapper[4666]: I1203 13:19:36.778260 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.886719041 podStartE2EDuration="4.77824374s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="2025-12-03 13:19:33.642319091 +0000 UTC m=+3962.487280142" lastFinishedPulling="2025-12-03 13:19:34.53384379 +0000 UTC m=+3963.378804841" observedRunningTime="2025-12-03 13:19:36.775187597 +0000 UTC m=+3965.620148668" watchObservedRunningTime="2025-12-03 13:19:36.77824374 +0000 UTC m=+3965.623204791" Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.512923 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.524218 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.755745 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerStarted","Data":"d06c93859d19e74426efad1fa6e0b103f13f51466057270c1554ef188ce28316"} Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.755826 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-log" containerID="cri-o://83254faf4d715d8729b12bc4ecc5e7b2b9d1d9f11854dfca71daed91eef8f03f" gracePeriod=30 Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.755949 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-httpd" containerID="cri-o://d06c93859d19e74426efad1fa6e0b103f13f51466057270c1554ef188ce28316" gracePeriod=30 Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.769070 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerStarted","Data":"46c9ad3f979d0816933f707d296656da69d68266187dfcb3b5c0cbcc18793c08"} Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.769305 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-log" containerID="cri-o://508b344bb8adb290403da3ac99aa6a2cd94ac5722e5b907535271da9338fb75e" gracePeriod=30 Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.769638 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-httpd" containerID="cri-o://46c9ad3f979d0816933f707d296656da69d68266187dfcb3b5c0cbcc18793c08" gracePeriod=30 Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.788209 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerStarted","Data":"e0f741fbf25ac29b7108847f15a0ed1bde33e9f7d6cb16205d8d1cc40400275f"} Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.793219 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.793202153 podStartE2EDuration="5.793202153s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:19:37.778868567 +0000 UTC m=+3966.623829628" watchObservedRunningTime="2025-12-03 13:19:37.793202153 +0000 UTC m=+3966.638163194" Dec 03 13:19:37 crc kubenswrapper[4666]: I1203 13:19:37.816517 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.81649735 podStartE2EDuration="5.81649735s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:19:37.803673484 +0000 UTC m=+3966.648634535" watchObservedRunningTime="2025-12-03 13:19:37.81649735 +0000 UTC m=+3966.661458401" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.130244 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-m76w2"] Dec 03 13:19:38 crc kubenswrapper[4666]: E1203 13:19:38.131259 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38105880-8017-49b9-a922-d936adc03946" containerName="mariadb-database-create" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.131278 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="38105880-8017-49b9-a922-d936adc03946" containerName="mariadb-database-create" Dec 03 13:19:38 crc kubenswrapper[4666]: E1203 13:19:38.131293 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a40df46-c8e9-496a-af5c-3066bafd781b" containerName="mariadb-account-create-update" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.131301 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a40df46-c8e9-496a-af5c-3066bafd781b" containerName="mariadb-account-create-update" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.131566 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a40df46-c8e9-496a-af5c-3066bafd781b" containerName="mariadb-account-create-update" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.131583 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="38105880-8017-49b9-a922-d936adc03946" containerName="mariadb-database-create" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.132425 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.134644 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-r9x5z" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.134827 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.139050 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-m76w2"] Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.219834 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.219940 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.219977 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh9lt\" (UniqueName: \"kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.220037 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.322267 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.322450 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.322504 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.322538 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh9lt\" (UniqueName: \"kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt\") pod \"manila-db-sync-m76w2\" (UID: 
\"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.331031 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.331172 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.331560 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.340591 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh9lt\" (UniqueName: \"kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt\") pod \"manila-db-sync-m76w2\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") " pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.426997 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:19:38 crc kubenswrapper[4666]: E1203 13:19:38.427719 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.606685 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-m76w2" Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.811608 4666 generic.go:334] "Generic (PLEG): container finished" podID="2573e227-e982-45eb-a3ee-27fe6835b056" containerID="46c9ad3f979d0816933f707d296656da69d68266187dfcb3b5c0cbcc18793c08" exitCode=0 Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.811643 4666 generic.go:334] "Generic (PLEG): container finished" podID="2573e227-e982-45eb-a3ee-27fe6835b056" containerID="508b344bb8adb290403da3ac99aa6a2cd94ac5722e5b907535271da9338fb75e" exitCode=143 Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.811710 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerDied","Data":"46c9ad3f979d0816933f707d296656da69d68266187dfcb3b5c0cbcc18793c08"} Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.811766 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerDied","Data":"508b344bb8adb290403da3ac99aa6a2cd94ac5722e5b907535271da9338fb75e"} Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.814907 4666 generic.go:334] "Generic (PLEG): container finished" podID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerID="d06c93859d19e74426efad1fa6e0b103f13f51466057270c1554ef188ce28316" exitCode=0 Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.814936 4666 generic.go:334] "Generic (PLEG): container finished" podID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerID="83254faf4d715d8729b12bc4ecc5e7b2b9d1d9f11854dfca71daed91eef8f03f" exitCode=143 Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.815919 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerDied","Data":"d06c93859d19e74426efad1fa6e0b103f13f51466057270c1554ef188ce28316"} Dec 03 13:19:38 crc kubenswrapper[4666]: I1203 13:19:38.815966 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerDied","Data":"83254faf4d715d8729b12bc4ecc5e7b2b9d1d9f11854dfca71daed91eef8f03f"} Dec 03 13:19:42 crc kubenswrapper[4666]: I1203 13:19:42.740498 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 03 13:19:42 crc kubenswrapper[4666]: I1203 13:19:42.794886 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.805739 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.809478 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.885499 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b48dba0-97b3-43a0-bbde-a6e73a064d40","Type":"ContainerDied","Data":"f5c7f51a834319cc6c04f7da31105fa3ae096554e616c3f172b05d5c4dce06db"} Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.885554 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.885571 4666 scope.go:117] "RemoveContainer" containerID="d06c93859d19e74426efad1fa6e0b103f13f51466057270c1554ef188ce28316" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.906217 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2573e227-e982-45eb-a3ee-27fe6835b056","Type":"ContainerDied","Data":"95c5569bf65d0ead959167a6d4b133aea69eed651d6a73699af42d88218e02c7"} Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.906318 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.932881 4666 scope.go:117] "RemoveContainer" containerID="83254faf4d715d8729b12bc4ecc5e7b2b9d1d9f11854dfca71daed91eef8f03f" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.990888 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.990944 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.990971 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.990991 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991005 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991025 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvx52\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991053 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991103 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991146 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991189 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991204 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991238 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991295 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991317 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991376 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2573e227-e982-45eb-a3ee-27fe6835b056\" (UID: \"2573e227-e982-45eb-a3ee-27fe6835b056\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991430 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991493 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtpj\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.991520 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5b48dba0-97b3-43a0-bbde-a6e73a064d40\" (UID: 
\"5b48dba0-97b3-43a0-bbde-a6e73a064d40\") " Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.993344 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs" (OuterVolumeSpecName: "logs") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.993517 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs" (OuterVolumeSpecName: "logs") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.996425 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.998487 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph" (OuterVolumeSpecName: "ceph") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.998747 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:19:43 crc kubenswrapper[4666]: I1203 13:19:43.999743 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.002023 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts" (OuterVolumeSpecName: "scripts") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.002446 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts" (OuterVolumeSpecName: "scripts") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.003128 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52" (OuterVolumeSpecName: "kube-api-access-bvx52") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "kube-api-access-bvx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.003239 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph" (OuterVolumeSpecName: "ceph") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.003677 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.005765 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj" (OuterVolumeSpecName: "kube-api-access-6gtpj") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "kube-api-access-6gtpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.017315 4666 scope.go:117] "RemoveContainer" containerID="46c9ad3f979d0816933f707d296656da69d68266187dfcb3b5c0cbcc18793c08" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.027909 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.047763 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data" (OuterVolumeSpecName: "config-data") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.059036 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.077318 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.082323 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2573e227-e982-45eb-a3ee-27fe6835b056" (UID: "2573e227-e982-45eb-a3ee-27fe6835b056"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093595 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093629 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093661 4666 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093676 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gtpj\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-kube-api-access-6gtpj\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093693 4666 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093705 4666 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093716 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093726 4666 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b48dba0-97b3-43a0-bbde-a6e73a064d40-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093735 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b48dba0-97b3-43a0-bbde-a6e73a064d40-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093743 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 
13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093751 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvx52\" (UniqueName: \"kubernetes.io/projected/2573e227-e982-45eb-a3ee-27fe6835b056-kube-api-access-bvx52\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093759 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093767 4666 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093777 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093795 4666 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093809 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2573e227-e982-45eb-a3ee-27fe6835b056-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.093821 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2573e227-e982-45eb-a3ee-27fe6835b056-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.096202 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data" (OuterVolumeSpecName: "config-data") pod "5b48dba0-97b3-43a0-bbde-a6e73a064d40" (UID: "5b48dba0-97b3-43a0-bbde-a6e73a064d40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.116278 4666 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.117547 4666 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.195175 4666 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.195211 4666 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.195220 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b48dba0-97b3-43a0-bbde-a6e73a064d40-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.211438 4666 scope.go:117] "RemoveContainer" containerID="508b344bb8adb290403da3ac99aa6a2cd94ac5722e5b907535271da9338fb75e" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.272798 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.322236 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.363308 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-m76w2"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.388877 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: E1203 13:19:44.389487 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.389511 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: E1203 13:19:44.389544 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.389552 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: E1203 13:19:44.389563 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.389570 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: E1203 13:19:44.389597 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.389605 4666 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.402930 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.402986 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.403015 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" containerName="glance-log" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.403029 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" containerName="glance-httpd" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.443195 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.443244 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.444775 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.449775 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.449888 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.454939 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.455190 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b6lz5" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.518142 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.547731 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.549355 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.561162 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.561379 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.565261 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-ceph\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.565471 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-logs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.565588 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566103 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566269 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566544 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566806 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566835 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwzd\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-kube-api-access-9bwzd\") pod \"glance-default-external-api-0\" (UID: 
\"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.566974 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.568236 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.669909 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670224 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670245 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670262 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670305 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw2qk\" (UniqueName: \"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-kube-api-access-mw2qk\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670750 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670789 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670806 4666 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670846 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670879 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670949 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670969 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.670988 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwzd\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-kube-api-access-9bwzd\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671040 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671066 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671146 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671178 4666 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-ceph\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671193 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-logs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671539 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-logs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671650 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0db6845-8502-4fa7-acdf-20c2395ca177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.671758 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.681854 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.682988 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-ceph\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.683258 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.684998 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.688044 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0db6845-8502-4fa7-acdf-20c2395ca177-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.691452 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwzd\" (UniqueName: \"kubernetes.io/projected/f0db6845-8502-4fa7-acdf-20c2395ca177-kube-api-access-9bwzd\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.766262 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"f0db6845-8502-4fa7-acdf-20c2395ca177\") " pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.774706 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.774786 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.774875 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.774896 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775189 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775201 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775338 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw2qk\" (UniqueName: \"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-kube-api-access-mw2qk\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 
crc kubenswrapper[4666]: I1203 13:19:44.775441 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775514 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775607 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.775668 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.776041 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.778580 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.778782 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.779490 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.782815 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.793441 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.852447 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.891727 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw2qk\" (UniqueName: \"kubernetes.io/projected/c8c0a38f-bf71-4907-ba78-bef2e7227dc6-kube-api-access-mw2qk\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.905210 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8c0a38f-bf71-4907-ba78-bef2e7227dc6\") " pod="openstack/glance-default-internal-api-0" Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.918580 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerStarted","Data":"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6"} Dec 03 13:19:44 crc kubenswrapper[4666]: I1203 13:19:44.922297 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m76w2" event={"ID":"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3","Type":"ContainerStarted","Data":"4545b9100359ac8b572d546e576865a566e11e2e62852f659e8df1e5ab8d067f"} Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.186695 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.434966 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2573e227-e982-45eb-a3ee-27fe6835b056" path="/var/lib/kubelet/pods/2573e227-e982-45eb-a3ee-27fe6835b056/volumes" Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.435993 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b48dba0-97b3-43a0-bbde-a6e73a064d40" path="/var/lib/kubelet/pods/5b48dba0-97b3-43a0-bbde-a6e73a064d40/volumes" Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.436644 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 13:19:45 crc kubenswrapper[4666]: W1203 13:19:45.595515 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0db6845_8502_4fa7_acdf_20c2395ca177.slice/crio-310d8b17bdbc5297a96bd10cc76c75bc8e88b6d6ea4b6b225aa4eb5fc93e0ee2 WatchSource:0}: Error finding container 310d8b17bdbc5297a96bd10cc76c75bc8e88b6d6ea4b6b225aa4eb5fc93e0ee2: Status 404 returned error can't find the container with id 310d8b17bdbc5297a96bd10cc76c75bc8e88b6d6ea4b6b225aa4eb5fc93e0ee2 Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.946813 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerStarted","Data":"202fc9ef7da1958d2e93e10fb94a83def433ab7e4a03be4f4511ca123b5c7da4"} Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.968831 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerStarted","Data":"b3dfc8d2b922aa5ba9a47c277fe1e56d0a399b83fa118f918b654ba1f9b303fa"} Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.981963 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerStarted","Data":"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf"} Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.989153 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bd58698c4-v4zw4" event={"ID":"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd","Type":"ContainerStarted","Data":"a09b56bd1cd1a3d131b8487af6992bdf5034a8240169546d1170e3702f7ea5e8"} Dec 03 13:19:45 crc kubenswrapper[4666]: I1203 13:19:45.991698 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0db6845-8502-4fa7-acdf-20c2395ca177","Type":"ContainerStarted","Data":"310d8b17bdbc5297a96bd10cc76c75bc8e88b6d6ea4b6b225aa4eb5fc93e0ee2"} Dec 03 13:19:46 crc kubenswrapper[4666]: I1203 13:19:46.009818 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-686f87d7cd-k4tjz" podStartSLOduration=3.8793698709999997 podStartE2EDuration="11.009782372s" podCreationTimestamp="2025-12-03 13:19:35 +0000 UTC" firstStartedPulling="2025-12-03 13:19:36.708665006 +0000 UTC m=+3965.553626057" lastFinishedPulling="2025-12-03 13:19:43.839077507 +0000 UTC m=+3972.684038558" observedRunningTime="2025-12-03 13:19:46.003278297 +0000 UTC m=+3974.848239358" watchObservedRunningTime="2025-12-03 13:19:46.009782372 +0000 UTC m=+3974.854743413" Dec 03 13:19:46 crc kubenswrapper[4666]: I1203 13:19:46.165682 4666 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 13:19:46 crc kubenswrapper[4666]: W1203 13:19:46.189901 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c0a38f_bf71_4907_ba78_bef2e7227dc6.slice/crio-b057239039b29947ad80b82e4abc59a1d68361ec871c3981b2f6df123a7ba0c1 WatchSource:0}: Error finding container b057239039b29947ad80b82e4abc59a1d68361ec871c3981b2f6df123a7ba0c1: Status 404 returned error can't find the container with id b057239039b29947ad80b82e4abc59a1d68361ec871c3981b2f6df123a7ba0c1 Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.011805 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerStarted","Data":"061b4fe81081a4b9cda4d16240d3ca66a498b11aacbe729591d2342a3b6da6f0"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.012201 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5db5796857-lghnw" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon-log" containerID="cri-o://202fc9ef7da1958d2e93e10fb94a83def433ab7e4a03be4f4511ca123b5c7da4" gracePeriod=30 Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.012638 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5db5796857-lghnw" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon" containerID="cri-o://061b4fe81081a4b9cda4d16240d3ca66a498b11aacbe729591d2342a3b6da6f0" gracePeriod=30 Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.018378 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerStarted","Data":"c75c1562d8b8e1c742f915b9019050855a4067b73c19f6a2c011c2c08122be41"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.018474 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f477cf945-89g4r" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon-log" containerID="cri-o://b3dfc8d2b922aa5ba9a47c277fe1e56d0a399b83fa118f918b654ba1f9b303fa" gracePeriod=30 Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.018584 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f477cf945-89g4r" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon" containerID="cri-o://c75c1562d8b8e1c742f915b9019050855a4067b73c19f6a2c011c2c08122be41" gracePeriod=30 Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.022423 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bd58698c4-v4zw4" event={"ID":"a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd","Type":"ContainerStarted","Data":"e241a3a170c89ab3257f91cf60a18e65cf5c1b151d2925c62e15a36d9b6baf20"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.026719 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8c0a38f-bf71-4907-ba78-bef2e7227dc6","Type":"ContainerStarted","Data":"84bdade3512970937e08bf6e662a3f5936398f4fda54be68b8162aec75ca7bcd"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.026774 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c8c0a38f-bf71-4907-ba78-bef2e7227dc6","Type":"ContainerStarted","Data":"b057239039b29947ad80b82e4abc59a1d68361ec871c3981b2f6df123a7ba0c1"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.035075 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0db6845-8502-4fa7-acdf-20c2395ca177","Type":"ContainerStarted","Data":"4ef81971a12025ea9301e88d645257ae8eb9969d824c35cd5d70df33dd3d9491"} Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.040263 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5db5796857-lghnw" podStartSLOduration=5.224060271 podStartE2EDuration="15.040241842s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="2025-12-03 13:19:34.046111471 +0000 UTC m=+3962.891072532" lastFinishedPulling="2025-12-03 13:19:43.862293052 +0000 UTC m=+3972.707254103" observedRunningTime="2025-12-03 13:19:47.039327047 +0000 UTC m=+3975.884288098" watchObservedRunningTime="2025-12-03 13:19:47.040241842 +0000 UTC m=+3975.885202893" Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.064156 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f477cf945-89g4r" podStartSLOduration=5.4173268740000005 podStartE2EDuration="15.064007552s" podCreationTimestamp="2025-12-03 13:19:32 +0000 UTC" firstStartedPulling="2025-12-03 13:19:34.215617994 +0000 UTC m=+3963.060579045" lastFinishedPulling="2025-12-03 13:19:43.862298672 +0000 UTC m=+3972.707259723" observedRunningTime="2025-12-03 13:19:47.056377836 +0000 UTC m=+3975.901338917" watchObservedRunningTime="2025-12-03 13:19:47.064007552 +0000 UTC m=+3975.908968603" Dec 03 13:19:47 crc kubenswrapper[4666]: I1203 13:19:47.082240 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6bd58698c4-v4zw4" podStartSLOduration=4.824221617 podStartE2EDuration="12.082218762s" podCreationTimestamp="2025-12-03 13:19:35 +0000 UTC" firstStartedPulling="2025-12-03 13:19:36.65976583 +0000 UTC m=+3965.504726881" lastFinishedPulling="2025-12-03 13:19:43.917762975 +0000 UTC m=+3972.762724026" observedRunningTime="2025-12-03 13:19:47.080339402 +0000 UTC m=+3975.925300463" watchObservedRunningTime="2025-12-03 13:19:47.082218762 +0000 UTC m=+3975.927179813" Dec 03 13:19:48 crc kubenswrapper[4666]: I1203 13:19:48.047908 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0db6845-8502-4fa7-acdf-20c2395ca177","Type":"ContainerStarted","Data":"c19687d5ac5be8b442530121ff837a7e5b9eef353032b0484e29831f602ce0ab"} Dec 03 13:19:48 crc kubenswrapper[4666]: I1203 13:19:48.083475 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.083450146 podStartE2EDuration="4.083450146s" podCreationTimestamp="2025-12-03 13:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:19:48.078129572 +0000 UTC m=+3976.923090633" watchObservedRunningTime="2025-12-03 13:19:48.083450146 +0000 UTC m=+3976.928411217" Dec 03 13:19:49 crc kubenswrapper[4666]: I1203 13:19:49.065004 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8c0a38f-bf71-4907-ba78-bef2e7227dc6","Type":"ContainerStarted","Data":"1177f708587cf77b3debf2b6b2f7374d7e40bc7e4ad035e65161c890926831c0"} Dec 03 
13:19:49 crc kubenswrapper[4666]: I1203 13:19:49.093784 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.093757812 podStartE2EDuration="5.093757812s" podCreationTimestamp="2025-12-03 13:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:19:49.092794616 +0000 UTC m=+3977.937755687" watchObservedRunningTime="2025-12-03 13:19:49.093757812 +0000 UTC m=+3977.938718863" Dec 03 13:19:49 crc kubenswrapper[4666]: I1203 13:19:49.423665 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:19:49 crc kubenswrapper[4666]: E1203 13:19:49.424010 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:19:53 crc kubenswrapper[4666]: I1203 13:19:53.154872 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:19:53 crc kubenswrapper[4666]: I1203 13:19:53.340404 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.133530 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m76w2" event={"ID":"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3","Type":"ContainerStarted","Data":"5ec4cfea7301a32be660a94ab5fbcf166ae3a6a063e235c19d7f31d6d1ab318e"} Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.162541 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-m76w2" podStartSLOduration=7.257419628 podStartE2EDuration="16.162514803s" podCreationTimestamp="2025-12-03 13:19:38 +0000 UTC" firstStartedPulling="2025-12-03 13:19:44.433664623 +0000 UTC m=+3973.278625664" lastFinishedPulling="2025-12-03 13:19:53.338759788 +0000 UTC m=+3982.183720839" observedRunningTime="2025-12-03 13:19:54.149880423 +0000 UTC m=+3982.994841484" watchObservedRunningTime="2025-12-03 13:19:54.162514803 +0000 UTC m=+3983.007475874" Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.853489 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.853544 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.892355 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 13:19:54 crc kubenswrapper[4666]: I1203 13:19:54.898880 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.145279 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.145556 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.187472 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.188233 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.234501 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.235950 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.913252 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.913436 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.914805 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.938435 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:55 crc kubenswrapper[4666]: I1203 13:19:55.938508 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:19:56 crc kubenswrapper[4666]: I1203 13:19:56.157838 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:56 crc kubenswrapper[4666]: I1203 13:19:56.157874 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 13:19:58 crc kubenswrapper[4666]: I1203 13:19:58.172992 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 13:19:58 crc kubenswrapper[4666]: I1203 13:19:58.173505 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.057381 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.058163 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.059996 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.060063 4666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.088059 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 13:20:01 crc kubenswrapper[4666]: I1203 13:20:01.099951 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 13:20:02 crc 
kubenswrapper[4666]: I1203 13:20:02.424129 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:20:02 crc kubenswrapper[4666]: E1203 13:20:02.426655 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:20:05 crc kubenswrapper[4666]: I1203 13:20:05.914028 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Dec 03 13:20:05 crc kubenswrapper[4666]: I1203 13:20:05.942302 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bd58698c4-v4zw4" podUID="a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Dec 03 13:20:12 crc kubenswrapper[4666]: I1203 13:20:12.299868 4666 generic.go:334] "Generic (PLEG): container finished" podID="1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" containerID="5ec4cfea7301a32be660a94ab5fbcf166ae3a6a063e235c19d7f31d6d1ab318e" exitCode=0 Dec 03 13:20:12 crc kubenswrapper[4666]: I1203 13:20:12.299962 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m76w2" event={"ID":"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3","Type":"ContainerDied","Data":"5ec4cfea7301a32be660a94ab5fbcf166ae3a6a063e235c19d7f31d6d1ab318e"} Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.423336 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:20:13 crc kubenswrapper[4666]: E1203 13:20:13.423674 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.795454 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-m76w2"
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.966797 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh9lt\" (UniqueName: \"kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt\") pod \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") "
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.966948 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data\") pod \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") "
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.967049 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle\") pod \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") "
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.967156 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data\") pod \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\" (UID: \"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3\") "
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.972651 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt" (OuterVolumeSpecName: "kube-api-access-dh9lt") pod "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" (UID: "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3"). InnerVolumeSpecName "kube-api-access-dh9lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.973237 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" (UID: "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.975177 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data" (OuterVolumeSpecName: "config-data") pod "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" (UID: "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:20:13 crc kubenswrapper[4666]: I1203 13:20:13.995133 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" (UID: "1fd740ba-30fe-4c8c-9a36-9d1c56170bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.069859 4666 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-job-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.069899 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.069909 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.069921 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh9lt\" (UniqueName: \"kubernetes.io/projected/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3-kube-api-access-dh9lt\") on node \"crc\" DevicePath \"\""
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.321605 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-m76w2" event={"ID":"1fd740ba-30fe-4c8c-9a36-9d1c56170bb3","Type":"ContainerDied","Data":"4545b9100359ac8b572d546e576865a566e11e2e62852f659e8df1e5ab8d067f"}
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.321653 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4545b9100359ac8b572d546e576865a566e11e2e62852f659e8df1e5ab8d067f"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.321690 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-m76w2"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.688326 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wjfgn"]
Dec 03 13:20:14 crc kubenswrapper[4666]: E1203 13:20:14.688893 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" containerName="manila-db-sync"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.688911 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" containerName="manila-db-sync"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.689175 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" containerName="manila-db-sync"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.690248 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.712802 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wjfgn"]
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.770161 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.772228 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.777762 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.777880 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-r9x5z"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.778032 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.788116 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.800042 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.802261 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.803553 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.807008 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.870052 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.894191 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-config\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.894486 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.894583 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8l2\" (UniqueName: \"kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.894679 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.897664 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.900304 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.900800 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.901081 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2j6f\" (UniqueName: \"kubernetes.io/projected/a761ff17-0cbb-43ca-83bc-5fb2b684203f-kube-api-access-s2j6f\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.901412 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.901646 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.901808 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.901954 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:14 crc kubenswrapper[4666]: I1203 13:20:14.991989 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.000346 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.007646 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008519 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008548 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-config\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008579 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008601 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008623 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8l2\" (UniqueName: \"kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008647 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008667 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008721 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008749 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008767 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008792 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008819 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrwt\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008890 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008918 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008937 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.008967 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2j6f\" (UniqueName: \"kubernetes.io/projected/a761ff17-0cbb-43ca-83bc-5fb2b684203f-kube-api-access-s2j6f\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.009000 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.009019 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.009043 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.009935 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.012990 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.015503 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-config\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.015569 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.015990 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.016300 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.017308 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a761ff17-0cbb-43ca-83bc-5fb2b684203f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.031813 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.042108 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.042841 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.045181 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.046403 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.058120 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2j6f\" (UniqueName: \"kubernetes.io/projected/a761ff17-0cbb-43ca-83bc-5fb2b684203f-kube-api-access-s2j6f\") pod \"dnsmasq-dns-76b5fdb995-wjfgn\" (UID: \"a761ff17-0cbb-43ca-83bc-5fb2b684203f\") " pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.058717 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz8l2\" (UniqueName: \"kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2\") pod \"manila-scheduler-0\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " pod="openstack/manila-scheduler-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111326 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111394 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111429 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0"
Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111451 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0"
(UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111499 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrwt\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111534 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111559 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111575 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111608 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111630 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111645 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnw2\" (UniqueName: \"kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111670 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.111711 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc 
kubenswrapper[4666]: I1203 13:20:15.111748 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.112363 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.113696 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.116799 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.117042 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.117475 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.118298 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.120611 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.132293 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrwt\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt\") pod \"manila-share-share1-0\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.170493 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.188616 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213267 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213389 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213433 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213457 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnw2\" (UniqueName: \"kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213493 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213721 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.213759 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.214841 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.214970 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.221953 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.222041 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.222561 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.233672 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.250021 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnw2\" (UniqueName: \"kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2\") pod \"manila-api-0\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.267121 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.307392 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.799671 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:15 crc kubenswrapper[4666]: W1203 13:20:15.814103 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfae0700_7f7f_472a_878f_9deefb4de325.slice/crio-93bb73ce06f27b84f76f9f21d9ae02c7cbce9a46cfa6790fbf3c73a201a01110 WatchSource:0}: Error finding container 93bb73ce06f27b84f76f9f21d9ae02c7cbce9a46cfa6790fbf3c73a201a01110: Status 404 returned error can't find the container with id 93bb73ce06f27b84f76f9f21d9ae02c7cbce9a46cfa6790fbf3c73a201a01110 Dec 03 13:20:15 crc kubenswrapper[4666]: I1203 13:20:15.988506 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:16 crc kubenswrapper[4666]: I1203 13:20:16.099536 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wjfgn"] Dec 03 13:20:16 crc kubenswrapper[4666]: I1203 13:20:16.345270 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerStarted","Data":"93bb73ce06f27b84f76f9f21d9ae02c7cbce9a46cfa6790fbf3c73a201a01110"} Dec 03 13:20:16 crc kubenswrapper[4666]: W1203 13:20:16.504674 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8ee0d2_2e33_48ac_af6f_ad4f01d73b87.slice/crio-cbda518357ab4d2b665291f8e4e239bf9b23c8482f687d4349fe577d2834249d WatchSource:0}: Error finding container cbda518357ab4d2b665291f8e4e239bf9b23c8482f687d4349fe577d2834249d: Status 404 returned error can't find the container with id cbda518357ab4d2b665291f8e4e239bf9b23c8482f687d4349fe577d2834249d Dec 03 13:20:16 crc kubenswrapper[4666]: I1203 13:20:16.986937 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.357783 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerStarted","Data":"8eb256b1c2cae0f96f2f08445268df6014c39d5519ac505192433c0d8ec119dd"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.358834 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" event={"ID":"a761ff17-0cbb-43ca-83bc-5fb2b684203f","Type":"ContainerStarted","Data":"60f59da2185dd999a9a13077ce6c181fb0b425e08e0147bc199fd36c1d9c5860"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.358858 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" event={"ID":"a761ff17-0cbb-43ca-83bc-5fb2b684203f","Type":"ContainerStarted","Data":"ba5b258932bc4e5de4999bf1b4b39bf2d2c4e1d37d9e9db20270c42ce7c14e5a"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.360710 4666 generic.go:334] "Generic (PLEG): container finished" podID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerID="061b4fe81081a4b9cda4d16240d3ca66a498b11aacbe729591d2342a3b6da6f0" exitCode=137 Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.360730 4666 generic.go:334] "Generic (PLEG): container finished" podID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerID="202fc9ef7da1958d2e93e10fb94a83def433ab7e4a03be4f4511ca123b5c7da4" exitCode=137 Dec 03 13:20:17 crc 
kubenswrapper[4666]: I1203 13:20:17.360765 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerDied","Data":"061b4fe81081a4b9cda4d16240d3ca66a498b11aacbe729591d2342a3b6da6f0"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.360783 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerDied","Data":"202fc9ef7da1958d2e93e10fb94a83def433ab7e4a03be4f4511ca123b5c7da4"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.363190 4666 generic.go:334] "Generic (PLEG): container finished" podID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerID="c75c1562d8b8e1c742f915b9019050855a4067b73c19f6a2c011c2c08122be41" exitCode=137 Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.363209 4666 generic.go:334] "Generic (PLEG): container finished" podID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerID="b3dfc8d2b922aa5ba9a47c277fe1e56d0a399b83fa118f918b654ba1f9b303fa" exitCode=137 Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.363243 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerDied","Data":"c75c1562d8b8e1c742f915b9019050855a4067b73c19f6a2c011c2c08122be41"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.363258 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerDied","Data":"b3dfc8d2b922aa5ba9a47c277fe1e56d0a399b83fa118f918b654ba1f9b303fa"} Dec 03 13:20:17 crc kubenswrapper[4666]: I1203 13:20:17.363968 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerStarted","Data":"cbda518357ab4d2b665291f8e4e239bf9b23c8482f687d4349fe577d2834249d"} Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.347392 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.389526 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.390927 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.391805 4666 generic.go:334] "Generic (PLEG): container finished" podID="a761ff17-0cbb-43ca-83bc-5fb2b684203f" containerID="60f59da2185dd999a9a13077ce6c181fb0b425e08e0147bc199fd36c1d9c5860" exitCode=0 Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.391868 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" event={"ID":"a761ff17-0cbb-43ca-83bc-5fb2b684203f","Type":"ContainerDied","Data":"60f59da2185dd999a9a13077ce6c181fb0b425e08e0147bc199fd36c1d9c5860"} Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401022 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts\") pod \"ab5d2c81-de3b-409e-9dae-63770077a8b4\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401067 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9w9s\" (UniqueName: \"kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s\") pod \"ab5d2c81-de3b-409e-9dae-63770077a8b4\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401124 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data\") pod \"ab5d2c81-de3b-409e-9dae-63770077a8b4\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401147 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data\") pod \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401641 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts\") pod \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401737 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs\") pod \"ab5d2c81-de3b-409e-9dae-63770077a8b4\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401829 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key\") pod \"ab5d2c81-de3b-409e-9dae-63770077a8b4\" (UID: \"ab5d2c81-de3b-409e-9dae-63770077a8b4\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401924 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key\") pod \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401951 4666 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2kghx\" (UniqueName: \"kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx\") pod \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.401979 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs\") pod \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\" (UID: \"8ebdb93b-7a58-46b5-82d6-28c35c6ff575\") " Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.402219 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5db5796857-lghnw" event={"ID":"ab5d2c81-de3b-409e-9dae-63770077a8b4","Type":"ContainerDied","Data":"94adce438f7d6ea77839d3a3566149108369fd74e9565770a5e55dfd43c51cce"} Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.402291 4666 scope.go:117] "RemoveContainer" containerID="061b4fe81081a4b9cda4d16240d3ca66a498b11aacbe729591d2342a3b6da6f0" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.408986 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5db5796857-lghnw" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.418357 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs" (OuterVolumeSpecName: "logs") pod "ab5d2c81-de3b-409e-9dae-63770077a8b4" (UID: "ab5d2c81-de3b-409e-9dae-63770077a8b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.422664 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs" (OuterVolumeSpecName: "logs") pod "8ebdb93b-7a58-46b5-82d6-28c35c6ff575" (UID: "8ebdb93b-7a58-46b5-82d6-28c35c6ff575"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.427075 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ab5d2c81-de3b-409e-9dae-63770077a8b4" (UID: "ab5d2c81-de3b-409e-9dae-63770077a8b4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.463369 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f477cf945-89g4r" event={"ID":"8ebdb93b-7a58-46b5-82d6-28c35c6ff575","Type":"ContainerDied","Data":"6c163b3b0b702deaa1bc68983e0857b1b26183c49b3c113c45cf8078fdcc1f49"} Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.463514 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f477cf945-89g4r" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.465672 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8ebdb93b-7a58-46b5-82d6-28c35c6ff575" (UID: "8ebdb93b-7a58-46b5-82d6-28c35c6ff575"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.470361 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerStarted","Data":"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26"} Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.488923 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s" (OuterVolumeSpecName: "kube-api-access-k9w9s") pod "ab5d2c81-de3b-409e-9dae-63770077a8b4" (UID: "ab5d2c81-de3b-409e-9dae-63770077a8b4"). InnerVolumeSpecName "kube-api-access-k9w9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.511595 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9w9s\" (UniqueName: \"kubernetes.io/projected/ab5d2c81-de3b-409e-9dae-63770077a8b4-kube-api-access-k9w9s\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.511622 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab5d2c81-de3b-409e-9dae-63770077a8b4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.511637 4666 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ab5d2c81-de3b-409e-9dae-63770077a8b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.511649 4666 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.511657 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.541516 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data" (OuterVolumeSpecName: "config-data") pod "ab5d2c81-de3b-409e-9dae-63770077a8b4" (UID: "ab5d2c81-de3b-409e-9dae-63770077a8b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.540548 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx" (OuterVolumeSpecName: "kube-api-access-2kghx") pod "8ebdb93b-7a58-46b5-82d6-28c35c6ff575" (UID: "8ebdb93b-7a58-46b5-82d6-28c35c6ff575"). InnerVolumeSpecName "kube-api-access-2kghx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.554957 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts" (OuterVolumeSpecName: "scripts") pod "ab5d2c81-de3b-409e-9dae-63770077a8b4" (UID: "ab5d2c81-de3b-409e-9dae-63770077a8b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.599885 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data" (OuterVolumeSpecName: "config-data") pod "8ebdb93b-7a58-46b5-82d6-28c35c6ff575" (UID: "8ebdb93b-7a58-46b5-82d6-28c35c6ff575"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.614643 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.614669 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab5d2c81-de3b-409e-9dae-63770077a8b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.614677 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.614687 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kghx\" (UniqueName: \"kubernetes.io/projected/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-kube-api-access-2kghx\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.634243 4666 scope.go:117] "RemoveContainer" containerID="202fc9ef7da1958d2e93e10fb94a83def433ab7e4a03be4f4511ca123b5c7da4" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.669326 4666 scope.go:117] "RemoveContainer" containerID="c75c1562d8b8e1c742f915b9019050855a4067b73c19f6a2c011c2c08122be41" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.680801 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts" (OuterVolumeSpecName: "scripts") pod "8ebdb93b-7a58-46b5-82d6-28c35c6ff575" (UID: "8ebdb93b-7a58-46b5-82d6-28c35c6ff575"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.716190 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ebdb93b-7a58-46b5-82d6-28c35c6ff575-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.747983 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.759058 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5db5796857-lghnw"] Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.808651 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.825266 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f477cf945-89g4r"] Dec 03 13:20:18 crc kubenswrapper[4666]: I1203 13:20:18.880836 4666 scope.go:117] "RemoveContainer" containerID="b3dfc8d2b922aa5ba9a47c277fe1e56d0a399b83fa118f918b654ba1f9b303fa" Dec 03 13:20:19 crc kubenswrapper[4666]: I1203 13:20:19.441919 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" path="/var/lib/kubelet/pods/8ebdb93b-7a58-46b5-82d6-28c35c6ff575/volumes" Dec 03 13:20:19 crc kubenswrapper[4666]: I1203 13:20:19.443478 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" path="/var/lib/kubelet/pods/ab5d2c81-de3b-409e-9dae-63770077a8b4/volumes" Dec 03 13:20:19 crc kubenswrapper[4666]: I1203 13:20:19.557058 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:20:19 crc kubenswrapper[4666]: I1203 13:20:19.565547 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.500974 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerStarted","Data":"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3"} Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.502115 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerStarted","Data":"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a"} Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.504516 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerStarted","Data":"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90"} Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.504625 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.504630 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api-log" containerID="cri-o://1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" gracePeriod=30 Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.504666 4666 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/manila-api-0" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api" containerID="cri-o://bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" gracePeriod=30 Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.523075 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.402271278 podStartE2EDuration="6.523056679s" podCreationTimestamp="2025-12-03 13:20:14 +0000 UTC" firstStartedPulling="2025-12-03 13:20:15.816266262 +0000 UTC m=+4004.661227313" lastFinishedPulling="2025-12-03 13:20:17.937051663 +0000 UTC m=+4006.782012714" observedRunningTime="2025-12-03 13:20:20.520326805 +0000 UTC m=+4009.365287856" watchObservedRunningTime="2025-12-03 13:20:20.523056679 +0000 UTC m=+4009.368017720" Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.533510 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" event={"ID":"a761ff17-0cbb-43ca-83bc-5fb2b684203f","Type":"ContainerStarted","Data":"19865a35249c70ab5427c6520723595de3468dab8e2869c4c0dc10112dc05fc6"} Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.533714 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.552456 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.55243443 podStartE2EDuration="6.55243443s" podCreationTimestamp="2025-12-03 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:20:20.54538991 +0000 UTC m=+4009.390350971" watchObservedRunningTime="2025-12-03 13:20:20.55243443 +0000 UTC m=+4009.397395471" Dec 03 13:20:20 crc kubenswrapper[4666]: I1203 13:20:20.568790 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" podStartSLOduration=6.568771109 podStartE2EDuration="6.568771109s" podCreationTimestamp="2025-12-03 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:20:20.564424102 +0000 UTC m=+4009.409385153" watchObservedRunningTime="2025-12-03 13:20:20.568771109 +0000 UTC m=+4009.413732170" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.250129 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331156 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331211 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331360 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331409 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsnw2\" (UniqueName: \"kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331439 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331501 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.331560 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle\") pod \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\" (UID: \"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87\") " Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.332843 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.333485 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs" (OuterVolumeSpecName: "logs") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.339811 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2" (OuterVolumeSpecName: "kube-api-access-gsnw2") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "kube-api-access-gsnw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.340010 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts" (OuterVolumeSpecName: "scripts") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.341201 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.395529 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.396135 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data" (OuterVolumeSpecName: "config-data") pod "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" (UID: "ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433589 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433654 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433666 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433682 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433695 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsnw2\" (UniqueName: \"kubernetes.io/projected/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-kube-api-access-gsnw2\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433708 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.433720 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.573992 4666 generic.go:334] "Generic (PLEG): container finished" podID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerID="bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" exitCode=0 Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.574038 4666 generic.go:334] "Generic (PLEG): container finished" podID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerID="1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" exitCode=143 Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.574752 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.576448 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerDied","Data":"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90"} Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.576515 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerDied","Data":"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26"} Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.576530 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87","Type":"ContainerDied","Data":"cbda518357ab4d2b665291f8e4e239bf9b23c8482f687d4349fe577d2834249d"} Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.576550 4666 scope.go:117] "RemoveContainer" containerID="bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.614421 4666 scope.go:117] "RemoveContainer" containerID="1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.633558 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.637138 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648303 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648738 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648753 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648774 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648782 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api" Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648793 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648800 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648828 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648834 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648841 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648848 4666 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: E1203 13:20:21.648859 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.648865 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.649049 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.649258 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.649283 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.649296 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebdb93b-7a58-46b5-82d6-28c35c6ff575" containerName="horizon" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.649308 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5d2c81-de3b-409e-9dae-63770077a8b4" containerName="horizon-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.650232 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" containerName="manila-api-log" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.651942 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.655674 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.655770 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.655905 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.662005 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.711473 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773437 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8015a12c-752c-489f-a52f-da3bf0ab2977-logs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773763 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data-custom\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773823 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-public-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773885 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773923 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8015a12c-752c-489f-a52f-da3bf0ab2977-etc-machine-id\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.773959 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-scripts\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.774020 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq95w\" (UniqueName: \"kubernetes.io/projected/8015a12c-752c-489f-a52f-da3bf0ab2977-kube-api-access-cq95w\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 
13:20:21.774106 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.774170 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.828770 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6bd58698c4-v4zw4" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877112 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8015a12c-752c-489f-a52f-da3bf0ab2977-logs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877172 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data-custom\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877231 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-public-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877324 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877374 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8015a12c-752c-489f-a52f-da3bf0ab2977-etc-machine-id\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877421 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-scripts\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877518 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq95w\" (UniqueName: \"kubernetes.io/projected/8015a12c-752c-489f-a52f-da3bf0ab2977-kube-api-access-cq95w\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877583 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.877654 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.878442 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8015a12c-752c-489f-a52f-da3bf0ab2977-logs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.878989 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8015a12c-752c-489f-a52f-da3bf0ab2977-etc-machine-id\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.887758 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.888505 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-scripts\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.890989 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.891980 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-config-data-custom\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.902283 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.910215 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq95w\" (UniqueName: \"kubernetes.io/projected/8015a12c-752c-489f-a52f-da3bf0ab2977-kube-api-access-cq95w\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.922968 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8015a12c-752c-489f-a52f-da3bf0ab2977-public-tls-certs\") pod \"manila-api-0\" (UID: \"8015a12c-752c-489f-a52f-da3bf0ab2977\") " pod="openstack/manila-api-0" Dec 03 13:20:21 crc kubenswrapper[4666]: I1203 13:20:21.945252 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:20:22 crc kubenswrapper[4666]: I1203 13:20:22.021184 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 03 13:20:22 crc kubenswrapper[4666]: I1203 13:20:22.584948 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon-log" containerID="cri-o://7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6" gracePeriod=30 Dec 03 13:20:22 crc kubenswrapper[4666]: I1203 13:20:22.585150 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" containerID="cri-o://40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf" gracePeriod=30 Dec 03 13:20:23 crc kubenswrapper[4666]: I1203 13:20:23.437101 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87" path="/var/lib/kubelet/pods/ff8ee0d2-2e33-48ac-af6f-ad4f01d73b87/volumes" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.230607 4666 scope.go:117] "RemoveContainer" containerID="bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" Dec 03 13:20:24 crc kubenswrapper[4666]: E1203 13:20:24.231017 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90\": container with ID starting with bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90 not found: ID does not exist" containerID="bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.231053 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90"} err="failed to get container status \"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90\": rpc error: code = NotFound desc = could not find container \"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90\": container with ID starting with bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90 not found: ID does not exist" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.231076 4666 scope.go:117] "RemoveContainer" containerID="1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" Dec 03 13:20:24 crc kubenswrapper[4666]: E1203 13:20:24.231429 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26\": container with ID starting with 1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26 not found: ID does not exist" containerID="1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.231453 4666 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26"} err="failed to get container status \"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26\": rpc error: code = NotFound desc = could not find container \"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26\": container with ID starting with 1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26 not found: ID does not exist" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.231472 4666 scope.go:117] "RemoveContainer" containerID="bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.232246 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90"} err="failed to get container status \"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90\": rpc error: code = NotFound desc = could not find container \"bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90\": container with ID starting with bcb592e53296f2358a4b338a651c21d6cd6cb448f516c2899477f0172e9cff90 not found: ID does not exist" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.232270 4666 scope.go:117] "RemoveContainer" containerID="1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.232451 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26"} err="failed to get container status \"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26\": rpc error: code = NotFound desc = could not find container \"1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26\": container with ID starting with 1bef81d9bdf4768a5f49d08bff00bff0ed6dc45a28cc7c71bbf7123351c0ef26 not found: ID does not exist" Dec 03 13:20:24 crc kubenswrapper[4666]: I1203 13:20:24.849349 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 03 13:20:24 crc kubenswrapper[4666]: W1203 13:20:24.867495 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8015a12c_752c_489f_a52f_da3bf0ab2977.slice/crio-8b22482f138ea09e0c0b2b47753b50c290773762765adffeba100cdd2624ae2d WatchSource:0}: Error finding container 8b22482f138ea09e0c0b2b47753b50c290773762765adffeba100cdd2624ae2d: Status 404 returned error can't find the container with id 8b22482f138ea09e0c0b2b47753b50c290773762765adffeba100cdd2624ae2d Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.170792 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.309270 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-wjfgn" Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.401628 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.401879 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="dnsmasq-dns" 
containerID="cri-o://4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823" gracePeriod=10 Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.614809 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8015a12c-752c-489f-a52f-da3bf0ab2977","Type":"ContainerStarted","Data":"8b22482f138ea09e0c0b2b47753b50c290773762765adffeba100cdd2624ae2d"} Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.616998 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerStarted","Data":"427f331bfd56c5fd322e91a34ece5b0e7374c3082e7f00b950a40227e72aaae4"} Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.617265 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerStarted","Data":"40429e16b1e25417d231c0f1a7b54e8b7b5f43f5d21d37ce1f140693a5012a65"} Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.686201 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.427992019 podStartE2EDuration="11.686175479s" podCreationTimestamp="2025-12-03 13:20:14 +0000 UTC" firstStartedPulling="2025-12-03 13:20:17.046401817 +0000 UTC m=+4005.891362868" lastFinishedPulling="2025-12-03 13:20:24.304585287 +0000 UTC m=+4013.149546328" observedRunningTime="2025-12-03 13:20:25.682679735 +0000 UTC m=+4014.527640786" watchObservedRunningTime="2025-12-03 13:20:25.686175479 +0000 UTC m=+4014.531136530" Dec 03 13:20:25 crc kubenswrapper[4666]: I1203 13:20:25.913846 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.191580 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275357 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275435 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275468 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mvxw\" (UniqueName: \"kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275522 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275537 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.275612 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config\") pod \"30859df6-46bb-4671-a9e9-def5132425af\" (UID: \"30859df6-46bb-4671-a9e9-def5132425af\") " Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.290957 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw" (OuterVolumeSpecName: "kube-api-access-5mvxw") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "kube-api-access-5mvxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.350175 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config" (OuterVolumeSpecName: "config") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.359669 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.367649 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.378071 4666 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-config\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.378129 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.378145 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mvxw\" (UniqueName: \"kubernetes.io/projected/30859df6-46bb-4671-a9e9-def5132425af-kube-api-access-5mvxw\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.378157 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.379015 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.392709 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30859df6-46bb-4671-a9e9-def5132425af" (UID: "30859df6-46bb-4671-a9e9-def5132425af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.479756 4666 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.479787 4666 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30859df6-46bb-4671-a9e9-def5132425af-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.626076 4666 generic.go:334] "Generic (PLEG): container finished" podID="30859df6-46bb-4671-a9e9-def5132425af" containerID="4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823" exitCode=0 Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.626154 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.626161 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" event={"ID":"30859df6-46bb-4671-a9e9-def5132425af","Type":"ContainerDied","Data":"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823"} Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.626196 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-ncm8v" event={"ID":"30859df6-46bb-4671-a9e9-def5132425af","Type":"ContainerDied","Data":"53f33e460d52f12c72a0902fbb8b237e8c3d2e12c79850cbfe44776648162b36"} Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.626233 4666 scope.go:117] "RemoveContainer" containerID="4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823" Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.631794 4666 generic.go:334] "Generic (PLEG): container finished" podID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerID="40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf" exitCode=0 Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.631854 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerDied","Data":"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf"} Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.634239 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8015a12c-752c-489f-a52f-da3bf0ab2977","Type":"ContainerStarted","Data":"9b3fba90c3a996f30719dedbcfb2ec8de95d28882a7375c74dee1169dfab4a21"} Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.659488 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 13:20:26 crc kubenswrapper[4666]: I1203 13:20:26.671726 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-ncm8v"] Dec 03 13:20:27 crc kubenswrapper[4666]: I1203 13:20:27.438235 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30859df6-46bb-4671-a9e9-def5132425af" path="/var/lib/kubelet/pods/30859df6-46bb-4671-a9e9-def5132425af/volumes" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.090387 4666 scope.go:117] "RemoveContainer" containerID="7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.113034 4666 scope.go:117] "RemoveContainer" containerID="4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823" Dec 03 13:20:28 crc kubenswrapper[4666]: E1203 13:20:28.114181 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823\": container with ID starting with 4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823 not found: ID does not exist" containerID="4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.114219 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823"} err="failed to get container status \"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823\": rpc error: code = NotFound desc = could not find container 
\"4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823\": container with ID starting with 4efb8a365a58d2469e9571defcdd623210027650a287a5ffa05addc756774823 not found: ID does not exist" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.114249 4666 scope.go:117] "RemoveContainer" containerID="7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2" Dec 03 13:20:28 crc kubenswrapper[4666]: E1203 13:20:28.114767 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2\": container with ID starting with 7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2 not found: ID does not exist" containerID="7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.114826 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2"} err="failed to get container status \"7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2\": rpc error: code = NotFound desc = could not find container \"7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2\": container with ID starting with 7e8fa48680347aa16a877481dffc8401dd8808b9b0bcb6c11a2fee582cd9ccf2 not found: ID does not exist" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.423381 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:20:28 crc kubenswrapper[4666]: E1203 13:20:28.423926 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.656742 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8015a12c-752c-489f-a52f-da3bf0ab2977","Type":"ContainerStarted","Data":"1953a6a2d6f2840eaf4fca14b3f14ba2fba9b04f0ba52a251060577007e4a4f4"} Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.656811 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 03 13:20:28 crc kubenswrapper[4666]: I1203 13:20:28.680394 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=7.680367471 podStartE2EDuration="7.680367471s" podCreationTimestamp="2025-12-03 13:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:20:28.670906317 +0000 UTC m=+4017.515867378" watchObservedRunningTime="2025-12-03 13:20:28.680367471 +0000 UTC m=+4017.525328522" Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.123544 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.124304 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="sg-core" 
containerID="cri-o://26aea19984fdc2ea693683e6ced5e4ac73f66bffde3b5ddc5109d70e52f5395d" gracePeriod=30 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.124331 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="proxy-httpd" containerID="cri-o://1eff69a01d383ce295b54faa506ef0c4c14e591212a901c04257e29cc5b8d928" gracePeriod=30 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.124376 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-notification-agent" containerID="cri-o://888dbaf538a49771c40e431e7d07815607819755fed8244ae3263094bbbb5940" gracePeriod=30 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.124258 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-central-agent" containerID="cri-o://65a67164bd8029fc0805f9561d317fb45c899cf4dfb2f22fb379789a99c68945" gracePeriod=30 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.676646 4666 generic.go:334] "Generic (PLEG): container finished" podID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerID="1eff69a01d383ce295b54faa506ef0c4c14e591212a901c04257e29cc5b8d928" exitCode=0 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.676681 4666 generic.go:334] "Generic (PLEG): container finished" podID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerID="26aea19984fdc2ea693683e6ced5e4ac73f66bffde3b5ddc5109d70e52f5395d" exitCode=2 Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.676734 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerDied","Data":"1eff69a01d383ce295b54faa506ef0c4c14e591212a901c04257e29cc5b8d928"} Dec 03 13:20:30 crc kubenswrapper[4666]: I1203 13:20:30.676829 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerDied","Data":"26aea19984fdc2ea693683e6ced5e4ac73f66bffde3b5ddc5109d70e52f5395d"} Dec 03 13:20:31 crc kubenswrapper[4666]: I1203 13:20:31.695169 4666 generic.go:334] "Generic (PLEG): container finished" podID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerID="65a67164bd8029fc0805f9561d317fb45c899cf4dfb2f22fb379789a99c68945" exitCode=0 Dec 03 13:20:31 crc kubenswrapper[4666]: I1203 13:20:31.695220 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerDied","Data":"65a67164bd8029fc0805f9561d317fb45c899cf4dfb2f22fb379789a99c68945"} Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.456801 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.187:3000/\": dial tcp 10.217.0.187:3000: connect: connection refused" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.710569 4666 generic.go:334] "Generic (PLEG): container finished" podID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerID="888dbaf538a49771c40e431e7d07815607819755fed8244ae3263094bbbb5940" exitCode=0 Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.710611 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerDied","Data":"888dbaf538a49771c40e431e7d07815607819755fed8244ae3263094bbbb5940"} Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.813440 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.910868 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911017 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911077 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwvsf\" (UniqueName: \"kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911119 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911156 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911200 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911231 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911274 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs\") pod \"4d139cd6-1cd0-4d2b-a353-808575a9d272\" (UID: \"4d139cd6-1cd0-4d2b-a353-808575a9d272\") " Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911436 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911684 4666 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.911754 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.917889 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts" (OuterVolumeSpecName: "scripts") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.928205 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf" (OuterVolumeSpecName: "kube-api-access-jwvsf") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "kube-api-access-jwvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.955053 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.989079 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:32 crc kubenswrapper[4666]: I1203 13:20:32.999407 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014496 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwvsf\" (UniqueName: \"kubernetes.io/projected/4d139cd6-1cd0-4d2b-a353-808575a9d272-kube-api-access-jwvsf\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014544 4666 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d139cd6-1cd0-4d2b-a353-808575a9d272-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014556 4666 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014567 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014578 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.014589 4666 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.016063 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data" (OuterVolumeSpecName: "config-data") pod "4d139cd6-1cd0-4d2b-a353-808575a9d272" (UID: "4d139cd6-1cd0-4d2b-a353-808575a9d272"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.118433 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d139cd6-1cd0-4d2b-a353-808575a9d272-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.723313 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d139cd6-1cd0-4d2b-a353-808575a9d272","Type":"ContainerDied","Data":"38eef3fee6c9d8a2624553ac19ea480a62e31fd015789daf9028dd7522472a9e"} Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.723361 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.723399 4666 scope.go:117] "RemoveContainer" containerID="1eff69a01d383ce295b54faa506ef0c4c14e591212a901c04257e29cc5b8d928" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.757670 4666 scope.go:117] "RemoveContainer" containerID="26aea19984fdc2ea693683e6ced5e4ac73f66bffde3b5ddc5109d70e52f5395d" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.774335 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.785283 4666 scope.go:117] "RemoveContainer" containerID="888dbaf538a49771c40e431e7d07815607819755fed8244ae3263094bbbb5940" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.793033 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.807520 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808203 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="dnsmasq-dns" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808231 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="dnsmasq-dns" Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808261 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="sg-core" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808270 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="sg-core" Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808292 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="init" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808299 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="init" Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808309 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="proxy-httpd" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808315 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="proxy-httpd" Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808322 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-central-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808329 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-central-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: E1203 13:20:33.808338 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-notification-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808344 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-notification-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808511 4666 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30859df6-46bb-4671-a9e9-def5132425af" containerName="dnsmasq-dns" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808521 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="proxy-httpd" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808537 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-notification-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808553 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="ceilometer-central-agent" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.808561 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" containerName="sg-core" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.810466 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.813734 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.814130 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.814298 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.817867 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.832263 4666 scope.go:117] "RemoveContainer" containerID="65a67164bd8029fc0805f9561d317fb45c899cf4dfb2f22fb379789a99c68945" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.942641 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-run-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943000 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-scripts\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943215 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjln\" (UniqueName: \"kubernetes.io/projected/cad8148a-80c4-407d-a0e0-6fd679f60f89-kube-api-access-9jjln\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943323 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943477 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-config-data\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943564 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943664 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:33 crc kubenswrapper[4666]: I1203 13:20:33.943744 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-log-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.044997 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-log-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045546 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-run-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045674 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-scripts\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045786 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjln\" (UniqueName: \"kubernetes.io/projected/cad8148a-80c4-407d-a0e0-6fd679f60f89-kube-api-access-9jjln\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045875 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045553 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-log-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.045974 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cad8148a-80c4-407d-a0e0-6fd679f60f89-run-httpd\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.046073 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-config-data\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.046157 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.046242 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.050232 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-scripts\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.050550 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-config-data\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.056786 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.063275 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.063799 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjln\" (UniqueName: \"kubernetes.io/projected/cad8148a-80c4-407d-a0e0-6fd679f60f89-kube-api-access-9jjln\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.064615 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad8148a-80c4-407d-a0e0-6fd679f60f89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cad8148a-80c4-407d-a0e0-6fd679f60f89\") " pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.136117 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.563827 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 13:20:34 crc kubenswrapper[4666]: W1203 13:20:34.567373 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcad8148a_80c4_407d_a0e0_6fd679f60f89.slice/crio-f712f8828cb42355d0f28844a6838c5751dfae62b1c3a7605fb1e6488b140d6f WatchSource:0}: Error finding container f712f8828cb42355d0f28844a6838c5751dfae62b1c3a7605fb1e6488b140d6f: Status 404 returned error can't find the container with id f712f8828cb42355d0f28844a6838c5751dfae62b1c3a7605fb1e6488b140d6f Dec 03 13:20:34 crc kubenswrapper[4666]: I1203 13:20:34.732396 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cad8148a-80c4-407d-a0e0-6fd679f60f89","Type":"ContainerStarted","Data":"f712f8828cb42355d0f28844a6838c5751dfae62b1c3a7605fb1e6488b140d6f"} Dec 03 13:20:35 crc kubenswrapper[4666]: I1203 13:20:35.192331 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 03 13:20:35 crc kubenswrapper[4666]: I1203 13:20:35.435219 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d139cd6-1cd0-4d2b-a353-808575a9d272" path="/var/lib/kubelet/pods/4d139cd6-1cd0-4d2b-a353-808575a9d272/volumes" Dec 03 13:20:35 crc kubenswrapper[4666]: I1203 13:20:35.743607 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cad8148a-80c4-407d-a0e0-6fd679f60f89","Type":"ContainerStarted","Data":"7e275ecd5a61d997d98a1a9d62dee712df3ed6a46c53dbf707ac70b2c690dca2"} Dec 03 13:20:35 crc kubenswrapper[4666]: I1203 13:20:35.914033 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.639760 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.678694 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.734108 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.756987 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="manila-scheduler" containerID="cri-o://50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a" gracePeriod=30 Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.757058 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="probe" containerID="cri-o://58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3" gracePeriod=30 Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.774804 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:36 crc 
kubenswrapper[4666]: I1203 13:20:36.775079 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="manila-share" containerID="cri-o://40429e16b1e25417d231c0f1a7b54e8b7b5f43f5d21d37ce1f140693a5012a65" gracePeriod=30 Dec 03 13:20:36 crc kubenswrapper[4666]: I1203 13:20:36.775282 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="probe" containerID="cri-o://427f331bfd56c5fd322e91a34ece5b0e7374c3082e7f00b950a40227e72aaae4" gracePeriod=30 Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.774846 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cad8148a-80c4-407d-a0e0-6fd679f60f89","Type":"ContainerStarted","Data":"f8d3ad0bd8d52cf89b6c261db3b2dc81edaa698c5e3c5a30dece514b175c590a"} Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.778903 4666 generic.go:334] "Generic (PLEG): container finished" podID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerID="427f331bfd56c5fd322e91a34ece5b0e7374c3082e7f00b950a40227e72aaae4" exitCode=0 Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.778937 4666 generic.go:334] "Generic (PLEG): container finished" podID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerID="40429e16b1e25417d231c0f1a7b54e8b7b5f43f5d21d37ce1f140693a5012a65" exitCode=1 Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.779001 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerDied","Data":"427f331bfd56c5fd322e91a34ece5b0e7374c3082e7f00b950a40227e72aaae4"} Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.779038 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerDied","Data":"40429e16b1e25417d231c0f1a7b54e8b7b5f43f5d21d37ce1f140693a5012a65"} Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.782470 4666 generic.go:334] "Generic (PLEG): container finished" podID="cfae0700-7f7f-472a-878f-9deefb4de325" containerID="58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3" exitCode=0 Dec 03 13:20:37 crc kubenswrapper[4666]: I1203 13:20:37.782526 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerDied","Data":"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3"} Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.013724 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032416 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032474 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlrwt\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032566 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032719 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032822 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032889 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.032921 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.033006 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom\") pod \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\" (UID: \"d2a2094d-31c4-40de-8a57-03b3f33f8d5c\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.036519 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.040061 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.042209 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph" (OuterVolumeSpecName: "ceph") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.044229 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts" (OuterVolumeSpecName: "scripts") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.049324 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt" (OuterVolumeSpecName: "kube-api-access-tlrwt") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "kube-api-access-tlrwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.063682 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.113080 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135611 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135654 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135667 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135676 4666 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135686 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlrwt\" (UniqueName: \"kubernetes.io/projected/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-kube-api-access-tlrwt\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135711 4666 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.135723 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.182693 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data" (OuterVolumeSpecName: "config-data") pod "d2a2094d-31c4-40de-8a57-03b3f33f8d5c" (UID: "d2a2094d-31c4-40de-8a57-03b3f33f8d5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.237818 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a2094d-31c4-40de-8a57-03b3f33f8d5c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.385196 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443200 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443270 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443296 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443402 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443477 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.443549 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz8l2\" (UniqueName: \"kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2\") pod \"cfae0700-7f7f-472a-878f-9deefb4de325\" (UID: \"cfae0700-7f7f-472a-878f-9deefb4de325\") " Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.444488 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.448569 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.448729 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts" (OuterVolumeSpecName: "scripts") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.449318 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2" (OuterVolumeSpecName: "kube-api-access-gz8l2") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "kube-api-access-gz8l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.495381 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.545817 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.548465 4666 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfae0700-7f7f-472a-878f-9deefb4de325-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.548535 4666 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.548598 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.548658 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz8l2\" (UniqueName: \"kubernetes.io/projected/cfae0700-7f7f-472a-878f-9deefb4de325-kube-api-access-gz8l2\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.548805 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data" (OuterVolumeSpecName: "config-data") pod "cfae0700-7f7f-472a-878f-9deefb4de325" (UID: "cfae0700-7f7f-472a-878f-9deefb4de325"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.651055 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfae0700-7f7f-472a-878f-9deefb4de325-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.802581 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cad8148a-80c4-407d-a0e0-6fd679f60f89","Type":"ContainerStarted","Data":"53bc9233da449e11f4357c2d7fcb74c9ff1b67acf5cf1cbbe2aa0b492a0a8c6d"} Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.805658 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d2a2094d-31c4-40de-8a57-03b3f33f8d5c","Type":"ContainerDied","Data":"8eb256b1c2cae0f96f2f08445268df6014c39d5519ac505192433c0d8ec119dd"} Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.805718 4666 scope.go:117] "RemoveContainer" containerID="427f331bfd56c5fd322e91a34ece5b0e7374c3082e7f00b950a40227e72aaae4" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.805733 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.813131 4666 generic.go:334] "Generic (PLEG): container finished" podID="cfae0700-7f7f-472a-878f-9deefb4de325" containerID="50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a" exitCode=0 Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.813354 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerDied","Data":"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a"} Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.813459 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cfae0700-7f7f-472a-878f-9deefb4de325","Type":"ContainerDied","Data":"93bb73ce06f27b84f76f9f21d9ae02c7cbce9a46cfa6790fbf3c73a201a01110"} Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.813614 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.843102 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.854880 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.855200 4666 scope.go:117] "RemoveContainer" containerID="40429e16b1e25417d231c0f1a7b54e8b7b5f43f5d21d37ce1f140693a5012a65" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.875530 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: E1203 13:20:38.875995 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876011 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: E1203 13:20:38.876024 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="manila-scheduler" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876031 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="manila-scheduler" Dec 03 13:20:38 crc kubenswrapper[4666]: E1203 13:20:38.876039 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="manila-share" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876045 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="manila-share" Dec 03 13:20:38 crc kubenswrapper[4666]: E1203 13:20:38.876058 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876063 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876301 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="manila-share" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876315 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876326 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="probe" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.876334 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" containerName="manila-scheduler" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.877740 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.881209 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.887284 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.898718 4666 scope.go:117] "RemoveContainer" containerID="58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.902175 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.914636 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.937946 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.940750 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.946576 4666 scope.go:117] "RemoveContainer" containerID="50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.953076 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.956427 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.956471 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rs9\" (UniqueName: \"kubernetes.io/projected/1aee1c1a-28b6-4db6-b927-a484fa641914-kube-api-access-49rs9\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.956524 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.957252 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvq4c\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-kube-api-access-vvq4c\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965291 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc 
kubenswrapper[4666]: I1203 13:20:38.965370 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1aee1c1a-28b6-4db6-b927-a484fa641914-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965409 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-scripts\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965451 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965518 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-scripts\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965617 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965735 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965828 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.965886 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.966828 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:38 crc kubenswrapper[4666]: I1203 13:20:38.968180 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-ceph\") pod \"manila-share-share1-0\" (UID: 
\"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.005922 4666 scope.go:117] "RemoveContainer" containerID="58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3" Dec 03 13:20:39 crc kubenswrapper[4666]: E1203 13:20:39.006275 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3\": container with ID starting with 58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3 not found: ID does not exist" containerID="58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.006310 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3"} err="failed to get container status \"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3\": rpc error: code = NotFound desc = could not find container \"58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3\": container with ID starting with 58a5b651594c3baf93a55d71d4231f5b2fb710dd1d55579057ded015dee6d2b3 not found: ID does not exist" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.006330 4666 scope.go:117] "RemoveContainer" containerID="50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a" Dec 03 13:20:39 crc kubenswrapper[4666]: E1203 13:20:39.006556 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a\": container with ID starting with 50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a not found: ID does not exist" containerID="50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.006581 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a"} err="failed to get container status \"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a\": rpc error: code = NotFound desc = could not find container \"50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a\": container with ID starting with 50590b2f555391f388fee9c8f65734bf49091ad4cdcfb920b5f0b256f5815a0a not found: ID does not exist" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070310 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070412 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvq4c\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-kube-api-access-vvq4c\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070452 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070479 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1aee1c1a-28b6-4db6-b927-a484fa641914-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070502 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-scripts\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070529 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070568 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-scripts\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070617 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070678 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070723 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070751 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070768 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-ceph\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 
13:20:39.070795 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.070811 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rs9\" (UniqueName: \"kubernetes.io/projected/1aee1c1a-28b6-4db6-b927-a484fa641914-kube-api-access-49rs9\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.071808 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1aee1c1a-28b6-4db6-b927-a484fa641914-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.073158 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.074602 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.074676 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11798731-763e-4a1e-97dc-54a4ff717ddf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.075235 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-scripts\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.075977 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-config-data\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.076302 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.076659 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " 
pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.077014 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.077621 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-ceph\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.079398 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aee1c1a-28b6-4db6-b927-a484fa641914-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.080760 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11798731-763e-4a1e-97dc-54a4ff717ddf-scripts\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.091157 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rs9\" (UniqueName: \"kubernetes.io/projected/1aee1c1a-28b6-4db6-b927-a484fa641914-kube-api-access-49rs9\") pod \"manila-scheduler-0\" (UID: \"1aee1c1a-28b6-4db6-b927-a484fa641914\") " pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.096367 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvq4c\" (UniqueName: \"kubernetes.io/projected/11798731-763e-4a1e-97dc-54a4ff717ddf-kube-api-access-vvq4c\") pod \"manila-share-share1-0\" (UID: \"11798731-763e-4a1e-97dc-54a4ff717ddf\") " pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.221877 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.295964 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.439642 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfae0700-7f7f-472a-878f-9deefb4de325" path="/var/lib/kubelet/pods/cfae0700-7f7f-472a-878f-9deefb4de325/volumes" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.443146 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a2094d-31c4-40de-8a57-03b3f33f8d5c" path="/var/lib/kubelet/pods/d2a2094d-31c4-40de-8a57-03b3f33f8d5c/volumes" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.776359 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 13:20:39 crc kubenswrapper[4666]: W1203 13:20:39.778566 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11798731_763e_4a1e_97dc_54a4ff717ddf.slice/crio-5a003c0bdf97967856cd633036df64d9d988ac83065abf55b7983d22757ab059 WatchSource:0}: Error finding container 5a003c0bdf97967856cd633036df64d9d988ac83065abf55b7983d22757ab059: Status 404 returned error can't find the container with id 5a003c0bdf97967856cd633036df64d9d988ac83065abf55b7983d22757ab059 Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.811388 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 13:20:39 crc kubenswrapper[4666]: W1203 13:20:39.821059 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aee1c1a_28b6_4db6_b927_a484fa641914.slice/crio-bda072f7115cb5803e8e5cd9fb801c33416a31acfdcd2056cae1245e940da143 WatchSource:0}: Error finding container bda072f7115cb5803e8e5cd9fb801c33416a31acfdcd2056cae1245e940da143: Status 404 returned error can't find the container with id bda072f7115cb5803e8e5cd9fb801c33416a31acfdcd2056cae1245e940da143 Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.825997 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cad8148a-80c4-407d-a0e0-6fd679f60f89","Type":"ContainerStarted","Data":"42d39739d67f02b67083581584135bb0ddd3bccf24b1b6299c2bb84f7ea0ffca"} Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.826144 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.830957 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"11798731-763e-4a1e-97dc-54a4ff717ddf","Type":"ContainerStarted","Data":"5a003c0bdf97967856cd633036df64d9d988ac83065abf55b7983d22757ab059"} Dec 03 13:20:39 crc kubenswrapper[4666]: I1203 13:20:39.858309 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.998163376 podStartE2EDuration="6.85828983s" podCreationTimestamp="2025-12-03 13:20:33 +0000 UTC" firstStartedPulling="2025-12-03 13:20:34.5696418 +0000 UTC m=+4023.414602851" lastFinishedPulling="2025-12-03 13:20:39.429768254 +0000 UTC m=+4028.274729305" observedRunningTime="2025-12-03 13:20:39.85606098 +0000 UTC m=+4028.701022051" watchObservedRunningTime="2025-12-03 13:20:39.85828983 +0000 UTC m=+4028.703250881" Dec 03 13:20:40 crc kubenswrapper[4666]: I1203 13:20:40.842606 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"1aee1c1a-28b6-4db6-b927-a484fa641914","Type":"ContainerStarted","Data":"b6e59a29bc4d90955d7862f1824126125cb20aac8a10987ffe58145910f24d7d"} Dec 03 13:20:40 crc kubenswrapper[4666]: I1203 13:20:40.842964 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1aee1c1a-28b6-4db6-b927-a484fa641914","Type":"ContainerStarted","Data":"bda072f7115cb5803e8e5cd9fb801c33416a31acfdcd2056cae1245e940da143"} Dec 03 13:20:40 crc kubenswrapper[4666]: I1203 13:20:40.845055 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"11798731-763e-4a1e-97dc-54a4ff717ddf","Type":"ContainerStarted","Data":"335cca82081f9889f17a17a001d804a39c86a510efb8b20375468e8e1e41fd11"} Dec 03 13:20:40 crc kubenswrapper[4666]: I1203 13:20:40.845600 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"11798731-763e-4a1e-97dc-54a4ff717ddf","Type":"ContainerStarted","Data":"2eb6970123f1e3c4b1ff6e78a8f1f150fcc713350b8aa64212cbf8342416ea16"} Dec 03 13:20:40 crc kubenswrapper[4666]: I1203 13:20:40.869814 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.86979392 podStartE2EDuration="2.86979392s" podCreationTimestamp="2025-12-03 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:20:40.863630994 +0000 UTC m=+4029.708592045" watchObservedRunningTime="2025-12-03 13:20:40.86979392 +0000 UTC m=+4029.714754971" Dec 03 13:20:41 crc kubenswrapper[4666]: I1203 13:20:41.854643 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1aee1c1a-28b6-4db6-b927-a484fa641914","Type":"ContainerStarted","Data":"1372eafd082352737d96a04bba01f065495fe58a7ab2b335468cb10a0efb7a6f"} Dec 03 13:20:41 crc kubenswrapper[4666]: I1203 13:20:41.878655 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.878637388 podStartE2EDuration="3.878637388s" podCreationTimestamp="2025-12-03 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:20:41.872337489 +0000 UTC m=+4030.717298540" watchObservedRunningTime="2025-12-03 13:20:41.878637388 +0000 UTC m=+4030.723598439" Dec 03 13:20:42 crc kubenswrapper[4666]: I1203 13:20:42.424551 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:20:42 crc kubenswrapper[4666]: E1203 13:20:42.424854 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:20:43 crc kubenswrapper[4666]: I1203 13:20:43.421604 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 03 13:20:45 crc kubenswrapper[4666]: I1203 13:20:45.914368 4666 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-686f87d7cd-k4tjz" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Dec 03 13:20:45 crc kubenswrapper[4666]: I1203 13:20:45.914892 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:20:49 crc kubenswrapper[4666]: I1203 13:20:49.223140 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 03 13:20:49 crc kubenswrapper[4666]: I1203 13:20:49.297064 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.951379 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.955551 4666 generic.go:334] "Generic (PLEG): container finished" podID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerID="7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6" exitCode=137 Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.955616 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerDied","Data":"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6"} Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.955640 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686f87d7cd-k4tjz" event={"ID":"ae41f444-5132-4df8-80dc-c80a6aea99f8","Type":"ContainerDied","Data":"e0f741fbf25ac29b7108847f15a0ed1bde33e9f7d6cb16205d8d1cc40400275f"} Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.955674 4666 scope.go:117] "RemoveContainer" containerID="40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:52.959536 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686f87d7cd-k4tjz" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.072798 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073222 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073321 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073351 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073375 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073426 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.073602 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmlmv\" (UniqueName: \"kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv\") pod \"ae41f444-5132-4df8-80dc-c80a6aea99f8\" (UID: \"ae41f444-5132-4df8-80dc-c80a6aea99f8\") " Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.074958 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs" (OuterVolumeSpecName: "logs") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.080014 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.080158 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv" (OuterVolumeSpecName: "kube-api-access-rmlmv") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "kube-api-access-rmlmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.103716 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts" (OuterVolumeSpecName: "scripts") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.104254 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.113277 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data" (OuterVolumeSpecName: "config-data") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.130407 4666 scope.go:117] "RemoveContainer" containerID="7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.161693 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ae41f444-5132-4df8-80dc-c80a6aea99f8" (UID: "ae41f444-5132-4df8-80dc-c80a6aea99f8"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178195 4666 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178219 4666 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178230 4666 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178250 4666 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae41f444-5132-4df8-80dc-c80a6aea99f8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178266 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmlmv\" (UniqueName: \"kubernetes.io/projected/ae41f444-5132-4df8-80dc-c80a6aea99f8-kube-api-access-rmlmv\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178284 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae41f444-5132-4df8-80dc-c80a6aea99f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.178294 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae41f444-5132-4df8-80dc-c80a6aea99f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.188383 4666 scope.go:117] "RemoveContainer" containerID="40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf" Dec 03 13:20:53 crc kubenswrapper[4666]: E1203 13:20:53.189057 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf\": container with ID starting with 40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf not found: ID does not exist" containerID="40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.189128 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf"} err="failed to get container status \"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf\": rpc error: code = NotFound desc = could not find container \"40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf\": container with ID starting with 40a6d65e74f7ba93a3dbe0625f2b467142990d835582b42262726633483f2caf not found: ID does not exist" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.189162 4666 scope.go:117] "RemoveContainer" containerID="7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6" Dec 03 13:20:53 crc kubenswrapper[4666]: E1203 13:20:53.189530 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6\": container with ID starting with 7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6 not found: ID does not exist" containerID="7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.189564 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6"} err="failed to get container status \"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6\": rpc error: code = NotFound desc = could not find container \"7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6\": container with ID starting with 7d7cd9afb88e53521b964656f320db0cbd1822cf8262f5aef185ad854704eae6 not found: ID does not exist" Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.297747 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.305022 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-686f87d7cd-k4tjz"] Dec 03 13:20:53 crc kubenswrapper[4666]: I1203 13:20:53.435055 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" path="/var/lib/kubelet/pods/ae41f444-5132-4df8-80dc-c80a6aea99f8/volumes" Dec 03 13:20:55 crc kubenswrapper[4666]: I1203 13:20:55.423801 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:20:55 crc kubenswrapper[4666]: E1203 13:20:55.424713 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:21:00 crc kubenswrapper[4666]: I1203 13:21:00.716016 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 03 13:21:00 crc kubenswrapper[4666]: I1203 13:21:00.856661 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 03 13:21:04 crc kubenswrapper[4666]: I1203 13:21:04.145387 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 13:21:06 crc kubenswrapper[4666]: I1203 13:21:06.423921 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:21:06 crc kubenswrapper[4666]: E1203 13:21:06.424967 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:21:17 crc kubenswrapper[4666]: I1203 13:21:17.423997 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:21:17 crc kubenswrapper[4666]: E1203 13:21:17.424842 4666 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:21:30 crc kubenswrapper[4666]: I1203 13:21:30.423618 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:21:30 crc kubenswrapper[4666]: E1203 13:21:30.424409 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:21:45 crc kubenswrapper[4666]: I1203 13:21:45.426168 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:21:45 crc kubenswrapper[4666]: E1203 13:21:45.426997 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.167937 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 13:21:55 crc kubenswrapper[4666]: E1203 13:21:55.169068 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon-log" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.169101 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon-log" Dec 03 13:21:55 crc kubenswrapper[4666]: E1203 13:21:55.169149 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.169159 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.169381 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon-log" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.169413 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae41f444-5132-4df8-80dc-c80a6aea99f8" containerName="horizon" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.170314 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.172885 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.173377 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bg7t9" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.173631 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.173930 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.228256 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372282 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372366 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67prk\" (UniqueName: \"kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372446 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372483 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372498 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372525 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372549 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372619 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.372648 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.474705 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475031 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475167 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475383 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475486 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475646 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475764 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.475773 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.476694 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.476795 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.476904 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.477010 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.477111 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67prk\" (UniqueName: \"kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.478559 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.480879 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.481084 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc 
kubenswrapper[4666]: I1203 13:21:55.492116 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.492629 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67prk\" (UniqueName: \"kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.506551 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " pod="openstack/tempest-tests-tempest" Dec 03 13:21:55 crc kubenswrapper[4666]: I1203 13:21:55.547168 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 13:21:56 crc kubenswrapper[4666]: I1203 13:21:56.014635 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 13:21:56 crc kubenswrapper[4666]: I1203 13:21:56.502325 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33fe3655-5d2c-48bd-8a4b-f436570d149c","Type":"ContainerStarted","Data":"be46491ee4efbcbbe7c67dfb3c82bc715016c0b5b2909ff57c9ed34cf314da82"} Dec 03 13:21:58 crc kubenswrapper[4666]: I1203 13:21:58.423343 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:21:58 crc kubenswrapper[4666]: E1203 13:21:58.423907 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:22:12 crc kubenswrapper[4666]: I1203 13:22:12.423804 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:22:12 crc kubenswrapper[4666]: E1203 13:22:12.425746 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:22:25 crc kubenswrapper[4666]: I1203 13:22:25.424706 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:22:25 crc kubenswrapper[4666]: E1203 13:22:25.425438 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:22:37 crc kubenswrapper[4666]: I1203 13:22:37.424153 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:22:37 crc kubenswrapper[4666]: E1203 13:22:37.424878 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:22:43 crc kubenswrapper[4666]: E1203 13:22:43.896443 4666 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 03 13:22:43 crc kubenswrapper[4666]: E1203 13:22:43.898145 4666 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67prk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:
*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(33fe3655-5d2c-48bd-8a4b-f436570d149c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 13:22:43 crc kubenswrapper[4666]: E1203 13:22:43.899429 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="33fe3655-5d2c-48bd-8a4b-f436570d149c" Dec 03 13:22:43 crc kubenswrapper[4666]: E1203 13:22:43.957338 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="33fe3655-5d2c-48bd-8a4b-f436570d149c" Dec 03 13:22:48 crc kubenswrapper[4666]: I1203 13:22:48.423777 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:22:48 crc kubenswrapper[4666]: I1203 13:22:48.997775 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606"} Dec 03 13:22:55 crc kubenswrapper[4666]: I1203 13:22:55.830049 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 13:22:57 crc kubenswrapper[4666]: I1203 13:22:57.063374 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33fe3655-5d2c-48bd-8a4b-f436570d149c","Type":"ContainerStarted","Data":"207e1f4e81e314f28fecc8ef2308a893c13f2e05c03701ace0025274dbe29575"} Dec 03 13:22:57 crc kubenswrapper[4666]: I1203 13:22:57.091021 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.290011245 podStartE2EDuration="1m3.090999308s" podCreationTimestamp="2025-12-03 13:21:54 +0000 UTC" firstStartedPulling="2025-12-03 13:21:56.025791354 +0000 UTC m=+4104.870752405" lastFinishedPulling="2025-12-03 13:22:55.826779417 +0000 UTC m=+4164.671740468" observedRunningTime="2025-12-03 13:22:57.089831116 +0000 UTC m=+4165.934792167" watchObservedRunningTime="2025-12-03 13:22:57.090999308 +0000 UTC m=+4165.935960389" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.039055 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 
13:24:50.041699 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.062031 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.152730 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.152782 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.152943 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2slg2\" (UniqueName: \"kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.254225 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.254287 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.254378 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2slg2\" (UniqueName: \"kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.255056 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.255205 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.274897 4666 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2slg2\" (UniqueName: \"kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2\") pod \"redhat-operators-ntmqn\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.429869 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:24:50 crc kubenswrapper[4666]: I1203 13:24:50.894456 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:24:51 crc kubenswrapper[4666]: I1203 13:24:51.239765 4666 generic.go:334] "Generic (PLEG): container finished" podID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerID="19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083" exitCode=0 Dec 03 13:24:51 crc kubenswrapper[4666]: I1203 13:24:51.239810 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerDied","Data":"19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083"} Dec 03 13:24:51 crc kubenswrapper[4666]: I1203 13:24:51.239837 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerStarted","Data":"6a1662e056eaebf270489076eab524c8c01daeffa69d12f761c4d5f8f43db4b4"} Dec 03 13:24:51 crc kubenswrapper[4666]: I1203 13:24:51.245850 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:24:52 crc kubenswrapper[4666]: I1203 13:24:52.250838 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerStarted","Data":"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad"} Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.276952 4666 generic.go:334] "Generic (PLEG): container finished" podID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerID="4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad" exitCode=0 Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.277022 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerDied","Data":"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad"} Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.416772 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.419251 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.437123 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.590876 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.590953 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7tbj\" (UniqueName: \"kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.591011 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-catalog-content\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.693138 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7tbj\" (UniqueName: \"kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.693271 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-catalog-content\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.693481 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.693792 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-catalog-content\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.693853 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:55 crc kubenswrapper[4666]: I1203 13:24:55.987718 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c7tbj\" (UniqueName: \"kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj\") pod \"certified-operators-hwz4t\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") " pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:56 crc kubenswrapper[4666]: I1203 13:24:56.042004 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:24:56 crc kubenswrapper[4666]: I1203 13:24:56.631347 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:24:56 crc kubenswrapper[4666]: W1203 13:24:56.634884 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0e1e65_72f5_4cf9_bc67_47aa591dd94a.slice/crio-cca0d986a0629f256a8b5093c2f69706b3a306a2a3da296a503c54e0b7db2cff WatchSource:0}: Error finding container cca0d986a0629f256a8b5093c2f69706b3a306a2a3da296a503c54e0b7db2cff: Status 404 returned error can't find the container with id cca0d986a0629f256a8b5093c2f69706b3a306a2a3da296a503c54e0b7db2cff Dec 03 13:24:57 crc kubenswrapper[4666]: I1203 13:24:57.303576 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerID="77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238" exitCode=0 Dec 03 13:24:57 crc kubenswrapper[4666]: I1203 13:24:57.303650 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerDied","Data":"77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238"} Dec 03 13:24:57 crc kubenswrapper[4666]: I1203 13:24:57.303956 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerStarted","Data":"cca0d986a0629f256a8b5093c2f69706b3a306a2a3da296a503c54e0b7db2cff"} Dec 03 13:24:57 crc kubenswrapper[4666]: I1203 13:24:57.306910 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerStarted","Data":"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b"} Dec 03 13:24:57 crc kubenswrapper[4666]: I1203 13:24:57.349991 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntmqn" podStartSLOduration=2.560491381 podStartE2EDuration="7.349975626s" podCreationTimestamp="2025-12-03 13:24:50 +0000 UTC" firstStartedPulling="2025-12-03 13:24:51.245557486 +0000 UTC m=+4280.090518537" lastFinishedPulling="2025-12-03 13:24:56.035041731 +0000 UTC m=+4284.880002782" observedRunningTime="2025-12-03 13:24:57.347665734 +0000 UTC m=+4286.192626785" watchObservedRunningTime="2025-12-03 13:24:57.349975626 +0000 UTC m=+4286.194936677" Dec 03 13:24:59 crc kubenswrapper[4666]: I1203 13:24:59.325518 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerStarted","Data":"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d"} Dec 03 13:25:00 crc kubenswrapper[4666]: I1203 13:25:00.335960 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" 
containerID="21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d" exitCode=0 Dec 03 13:25:00 crc kubenswrapper[4666]: I1203 13:25:00.336120 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerDied","Data":"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d"} Dec 03 13:25:00 crc kubenswrapper[4666]: I1203 13:25:00.430555 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:00 crc kubenswrapper[4666]: I1203 13:25:00.432337 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:01 crc kubenswrapper[4666]: I1203 13:25:01.346936 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerStarted","Data":"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096"} Dec 03 13:25:01 crc kubenswrapper[4666]: I1203 13:25:01.368905 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwz4t" podStartSLOduration=2.922424026 podStartE2EDuration="6.368883288s" podCreationTimestamp="2025-12-03 13:24:55 +0000 UTC" firstStartedPulling="2025-12-03 13:24:57.305235242 +0000 UTC m=+4286.150196293" lastFinishedPulling="2025-12-03 13:25:00.751694504 +0000 UTC m=+4289.596655555" observedRunningTime="2025-12-03 13:25:01.36300675 +0000 UTC m=+4290.207967801" watchObservedRunningTime="2025-12-03 13:25:01.368883288 +0000 UTC m=+4290.213844359" Dec 03 13:25:01 crc kubenswrapper[4666]: I1203 13:25:01.481462 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ntmqn" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" probeResult="failure" output=< Dec 03 13:25:01 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 13:25:01 crc kubenswrapper[4666]: > Dec 03 13:25:06 crc kubenswrapper[4666]: I1203 13:25:06.043789 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:25:06 crc kubenswrapper[4666]: I1203 13:25:06.044417 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:25:06 crc kubenswrapper[4666]: I1203 13:25:06.159396 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:25:06 crc kubenswrapper[4666]: I1203 13:25:06.448411 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:25:06 crc kubenswrapper[4666]: I1203 13:25:06.497610 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:25:08 crc kubenswrapper[4666]: I1203 13:25:08.404452 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwz4t" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="registry-server" containerID="cri-o://5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096" gracePeriod=2 Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.088470 4666 util.go:48] "No ready 
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.088470 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwz4t"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.272551 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities\") pod \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") "
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.272699 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7tbj\" (UniqueName: \"kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj\") pod \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") "
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.272821 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-catalog-content\") pod \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\" (UID: \"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a\") "
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.273719 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities" (OuterVolumeSpecName: "utilities") pod "5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" (UID: "5c0e1e65-72f5-4cf9-bc67-47aa591dd94a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.283761 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj" (OuterVolumeSpecName: "kube-api-access-c7tbj") pod "5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" (UID: "5c0e1e65-72f5-4cf9-bc67-47aa591dd94a"). InnerVolumeSpecName "kube-api-access-c7tbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
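
[Note: deletion unwinds the mounts in reverse of the setup order: UnmountVolume starts per volume, TearDown succeeds (OuterVolumeSpecName is the name in the pod spec, InnerVolumeSpecName the plugin-internal name), and the "Volume detached ... DevicePath \"\"" records that follow mark each volume gone from the actual state of the world. After the later "Cleaned up orphaned pod volumes dir" record the per-pod directory itself is removed, which a quick check can confirm; a sketch, with the path taken from the log:

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        // Pod volumes dir for certified-operators-hwz4t (UID from the log).
        p := "/var/lib/kubelet/pods/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a/volumes"
        if _, err := os.Stat(p); os.IsNotExist(err) {
            fmt.Println("volumes dir removed:", p)
        } else if err != nil {
            fmt.Println("stat failed:", err)
        } else {
            fmt.Println("volumes dir still present:", p)
        }
    }
]
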
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.375575 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.375953 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7tbj\" (UniqueName: \"kubernetes.io/projected/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-kube-api-access-c7tbj\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.376028 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.415602 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerID="5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096" exitCode=0 Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.415651 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwz4t" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.415656 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerDied","Data":"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096"} Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.415776 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwz4t" event={"ID":"5c0e1e65-72f5-4cf9-bc67-47aa591dd94a","Type":"ContainerDied","Data":"cca0d986a0629f256a8b5093c2f69706b3a306a2a3da296a503c54e0b7db2cff"} Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.415795 4666 scope.go:117] "RemoveContainer" containerID="5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.464598 4666 scope.go:117] "RemoveContainer" containerID="21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.468184 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.478644 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwz4t"] Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.494509 4666 scope.go:117] "RemoveContainer" containerID="77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.533016 4666 scope.go:117] "RemoveContainer" containerID="5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096" Dec 03 13:25:09 crc kubenswrapper[4666]: E1203 13:25:09.533517 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096\": container with ID starting with 5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096 not found: ID does not exist" containerID="5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096" Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.533560 
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.533560 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096"} err="failed to get container status \"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096\": rpc error: code = NotFound desc = could not find container \"5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096\": container with ID starting with 5e467acc4e05eb21e337db81c0ac3390e2417ecd1c1718e7990194218f9bf096 not found: ID does not exist"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.533589 4666 scope.go:117] "RemoveContainer" containerID="21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d"
Dec 03 13:25:09 crc kubenswrapper[4666]: E1203 13:25:09.534003 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d\": container with ID starting with 21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d not found: ID does not exist" containerID="21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.534044 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d"} err="failed to get container status \"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d\": rpc error: code = NotFound desc = could not find container \"21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d\": container with ID starting with 21746a4b0ed39dbf1490f1ff56ddea4caae99efe02d585a5c9f4d5c09242a69d not found: ID does not exist"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.534068 4666 scope.go:117] "RemoveContainer" containerID="77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238"
Dec 03 13:25:09 crc kubenswrapper[4666]: E1203 13:25:09.534703 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238\": container with ID starting with 77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238 not found: ID does not exist" containerID="77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.534744 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238"} err="failed to get container status \"77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238\": rpc error: code = NotFound desc = could not find container \"77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238\": container with ID starting with 77915709f3ec44c624d615f8ec47ce1ea43898d1d0ee17133d08a1ae84058238 not found: ID does not exist"
Dec 03 13:25:09 crc kubenswrapper[4666]: I1203 13:25:09.865831 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
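
[Note: the E-level "ContainerStatus from runtime service failed ... NotFound" entries above look alarming but are a benign race: the kubelet has just removed these containers, then resolves their IDs once more while finishing deletion, and CRI-O answers NotFound for IDs that no longer exist; pod_container_deletor logs the error and moves on. A sketch of how a CRI client can treat this case as "already gone", using google.golang.org/grpc/status and codes (the packages that produce the "rpc error: code = NotFound" strings above):

    package criutil

    import (
        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // AlreadyGone reports whether a CRI error just means the container
    // was removed before we asked about it, as in the records above.
    func AlreadyGone(err error) bool {
        s, ok := status.FromError(err)
        return ok && s.Code() == codes.NotFound
    }
]
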
podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:25:11 crc kubenswrapper[4666]: I1203 13:25:11.440915 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" path="/var/lib/kubelet/pods/5c0e1e65-72f5-4cf9-bc67-47aa591dd94a/volumes" Dec 03 13:25:11 crc kubenswrapper[4666]: I1203 13:25:11.493063 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ntmqn" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" probeResult="failure" output=< Dec 03 13:25:11 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s Dec 03 13:25:11 crc kubenswrapper[4666]: > Dec 03 13:25:20 crc kubenswrapper[4666]: I1203 13:25:20.477362 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:20 crc kubenswrapper[4666]: I1203 13:25:20.522817 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:21 crc kubenswrapper[4666]: I1203 13:25:21.242996 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:25:21 crc kubenswrapper[4666]: I1203 13:25:21.529542 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ntmqn" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" containerID="cri-o://50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b" gracePeriod=2 Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.370144 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.535141 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content\") pod \"9fb3e370-9b1b-4e27-821f-8c89f4320211\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.536262 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities\") pod \"9fb3e370-9b1b-4e27-821f-8c89f4320211\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.536512 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2slg2\" (UniqueName: \"kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2\") pod \"9fb3e370-9b1b-4e27-821f-8c89f4320211\" (UID: \"9fb3e370-9b1b-4e27-821f-8c89f4320211\") " Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.539602 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities" (OuterVolumeSpecName: "utilities") pod "9fb3e370-9b1b-4e27-821f-8c89f4320211" (UID: "9fb3e370-9b1b-4e27-821f-8c89f4320211"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.553431 4666 generic.go:334] "Generic (PLEG): container finished" podID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerID="50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b" exitCode=0 Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.553483 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerDied","Data":"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b"} Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.553521 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntmqn" event={"ID":"9fb3e370-9b1b-4e27-821f-8c89f4320211","Type":"ContainerDied","Data":"6a1662e056eaebf270489076eab524c8c01daeffa69d12f761c4d5f8f43db4b4"} Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.553543 4666 scope.go:117] "RemoveContainer" containerID="50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.553702 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntmqn" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.560520 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2" (OuterVolumeSpecName: "kube-api-access-2slg2") pod "9fb3e370-9b1b-4e27-821f-8c89f4320211" (UID: "9fb3e370-9b1b-4e27-821f-8c89f4320211"). InnerVolumeSpecName "kube-api-access-2slg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.618486 4666 scope.go:117] "RemoveContainer" containerID="4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.639229 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.639367 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2slg2\" (UniqueName: \"kubernetes.io/projected/9fb3e370-9b1b-4e27-821f-8c89f4320211-kube-api-access-2slg2\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.641859 4666 scope.go:117] "RemoveContainer" containerID="19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.671765 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fb3e370-9b1b-4e27-821f-8c89f4320211" (UID: "9fb3e370-9b1b-4e27-821f-8c89f4320211"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.681034 4666 scope.go:117] "RemoveContainer" containerID="50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b" Dec 03 13:25:22 crc kubenswrapper[4666]: E1203 13:25:22.681606 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b\": container with ID starting with 50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b not found: ID does not exist" containerID="50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.681695 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b"} err="failed to get container status \"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b\": rpc error: code = NotFound desc = could not find container \"50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b\": container with ID starting with 50d0b04508a861a9774ea04f03f0b3b0d66a718ac60f28e2ccd21e49bf5af21b not found: ID does not exist" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.681766 4666 scope.go:117] "RemoveContainer" containerID="4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad" Dec 03 13:25:22 crc kubenswrapper[4666]: E1203 13:25:22.682208 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad\": container with ID starting with 4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad not found: ID does not exist" containerID="4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.682248 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad"} err="failed to get container status \"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad\": rpc error: code = NotFound desc = could not find container \"4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad\": container with ID starting with 4503031625e14f3c7a93ef2f0b791dd0ddecba310bb3756f4563fb2210d5caad not found: ID does not exist" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.682274 4666 scope.go:117] "RemoveContainer" containerID="19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083" Dec 03 13:25:22 crc kubenswrapper[4666]: E1203 13:25:22.682730 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083\": container with ID starting with 19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083 not found: ID does not exist" containerID="19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.682755 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083"} err="failed to get container status \"19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083\": rpc error: code = NotFound desc = could not 
find container \"19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083\": container with ID starting with 19906f0be19c1a2ab4394418310c0c94116bd2225e22598a01d489849571b083 not found: ID does not exist" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.741359 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb3e370-9b1b-4e27-821f-8c89f4320211-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.886410 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:25:22 crc kubenswrapper[4666]: I1203 13:25:22.896187 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ntmqn"] Dec 03 13:25:23 crc kubenswrapper[4666]: I1203 13:25:23.437258 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" path="/var/lib/kubelet/pods/9fb3e370-9b1b-4e27-821f-8c89f4320211/volumes" Dec 03 13:25:39 crc kubenswrapper[4666]: I1203 13:25:39.866040 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:25:39 crc kubenswrapper[4666]: I1203 13:25:39.866574 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:26:09 crc kubenswrapper[4666]: I1203 13:26:09.866206 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:26:09 crc kubenswrapper[4666]: I1203 13:26:09.866710 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:26:09 crc kubenswrapper[4666]: I1203 13:26:09.866754 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:26:09 crc kubenswrapper[4666]: I1203 13:26:09.867488 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:26:09 crc kubenswrapper[4666]: I1203 13:26:09.867539 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" 
containerID="cri-o://ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606" gracePeriod=600 Dec 03 13:26:10 crc kubenswrapper[4666]: I1203 13:26:10.982250 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606" exitCode=0 Dec 03 13:26:10 crc kubenswrapper[4666]: I1203 13:26:10.982330 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606"} Dec 03 13:26:10 crc kubenswrapper[4666]: I1203 13:26:10.982759 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"} Dec 03 13:26:10 crc kubenswrapper[4666]: I1203 13:26:10.982780 4666 scope.go:117] "RemoveContainer" containerID="d04bcede28b1153b3e068e4163250053be199a8f2f8018b9897f06ed058cfe53" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:06.999052 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8f57m"] Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000024 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000039 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000049 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="extract-content" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000054 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="extract-content" Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000072 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="extract-content" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000078 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="extract-content" Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000114 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="extract-utilities" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000120 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="extract-utilities" Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000135 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="extract-utilities" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000140 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="extract-utilities" Dec 03 13:28:07 crc kubenswrapper[4666]: E1203 13:28:07.000149 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" 
containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000155 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000349 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb3e370-9b1b-4e27-821f-8c89f4320211" containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.000363 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0e1e65-72f5-4cf9-bc67-47aa591dd94a" containerName="registry-server" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.001742 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.020871 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8f57m"] Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.072124 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.072725 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-utilities\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.174110 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-utilities\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.174421 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hfc6\" (UniqueName: \"kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.174461 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.174575 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-utilities\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.174728 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.277063 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hfc6\" (UniqueName: \"kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.298902 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hfc6\" (UniqueName: \"kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6\") pod \"community-operators-8f57m\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") " pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:07 crc kubenswrapper[4666]: I1203 13:28:07.329883 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:08 crc kubenswrapper[4666]: I1203 13:28:08.063325 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8f57m"] Dec 03 13:28:09 crc kubenswrapper[4666]: I1203 13:28:09.008901 4666 generic.go:334] "Generic (PLEG): container finished" podID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerID="0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f" exitCode=0 Dec 03 13:28:09 crc kubenswrapper[4666]: I1203 13:28:09.009405 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerDied","Data":"0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f"} Dec 03 13:28:09 crc kubenswrapper[4666]: I1203 13:28:09.009431 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerStarted","Data":"ab40689072e56d1baded986ca3395d8d5314cdad827222d82e2aeeed8be4ea5b"} Dec 03 13:28:10 crc kubenswrapper[4666]: I1203 13:28:10.025491 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerStarted","Data":"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e"} Dec 03 13:28:11 crc kubenswrapper[4666]: I1203 13:28:11.042154 4666 generic.go:334] "Generic (PLEG): container finished" podID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerID="71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e" exitCode=0 Dec 03 13:28:11 crc kubenswrapper[4666]: I1203 13:28:11.042200 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerDied","Data":"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e"} Dec 03 13:28:12 crc kubenswrapper[4666]: I1203 13:28:12.051762 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerStarted","Data":"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af"} 
Dec 03 13:28:12 crc kubenswrapper[4666]: I1203 13:28:12.076016 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8f57m" podStartSLOduration=3.579689403 podStartE2EDuration="6.075996959s" podCreationTimestamp="2025-12-03 13:28:06 +0000 UTC" firstStartedPulling="2025-12-03 13:28:09.011545729 +0000 UTC m=+4477.856506790" lastFinishedPulling="2025-12-03 13:28:11.507853295 +0000 UTC m=+4480.352814346" observedRunningTime="2025-12-03 13:28:12.070358337 +0000 UTC m=+4480.915319398" watchObservedRunningTime="2025-12-03 13:28:12.075996959 +0000 UTC m=+4480.920958040"
Dec 03 13:28:17 crc kubenswrapper[4666]: I1203 13:28:17.330443 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8f57m"
Dec 03 13:28:17 crc kubenswrapper[4666]: I1203 13:28:17.330998 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8f57m"
Dec 03 13:28:17 crc kubenswrapper[4666]: I1203 13:28:17.625625 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8f57m"
Dec 03 13:28:18 crc kubenswrapper[4666]: I1203 13:28:18.146487 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8f57m"
Dec 03 13:28:18 crc kubenswrapper[4666]: I1203 13:28:18.196014 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8f57m"]
Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.120881 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8f57m" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="registry-server" containerID="cri-o://f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af" gracePeriod=2
Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.708563 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8f57m"
Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.858381 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hfc6\" (UniqueName: \"kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6\") pod \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") "
Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.858642 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content\") pod \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") "
Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.858736 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-utilities\") pod \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\" (UID: \"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962\") "
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.873458 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6" (OuterVolumeSpecName: "kube-api-access-5hfc6") pod "8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" (UID: "8e53b0ee-6e13-471e-8ba8-f6b02b5f9962"). InnerVolumeSpecName "kube-api-access-5hfc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.914710 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" (UID: "8e53b0ee-6e13-471e-8ba8-f6b02b5f9962"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.961562 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.961604 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:28:20 crc kubenswrapper[4666]: I1203 13:28:20.961614 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hfc6\" (UniqueName: \"kubernetes.io/projected/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962-kube-api-access-5hfc6\") on node \"crc\" DevicePath \"\"" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.133633 4666 generic.go:334] "Generic (PLEG): container finished" podID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerID="f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af" exitCode=0 Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.133681 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerDied","Data":"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af"} Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.133716 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8f57m" event={"ID":"8e53b0ee-6e13-471e-8ba8-f6b02b5f9962","Type":"ContainerDied","Data":"ab40689072e56d1baded986ca3395d8d5314cdad827222d82e2aeeed8be4ea5b"} Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.133735 4666 scope.go:117] "RemoveContainer" containerID="f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.133742 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8f57m" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.156373 4666 scope.go:117] "RemoveContainer" containerID="71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.176407 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8f57m"] Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.184288 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8f57m"] Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.188816 4666 scope.go:117] "RemoveContainer" containerID="0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.224420 4666 scope.go:117] "RemoveContainer" containerID="f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af" Dec 03 13:28:21 crc kubenswrapper[4666]: E1203 13:28:21.225571 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af\": container with ID starting with f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af not found: ID does not exist" containerID="f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.225615 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af"} err="failed to get container status \"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af\": rpc error: code = NotFound desc = could not find container \"f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af\": container with ID starting with f41adba76b3b313f52464ab62c8df3bc20941e141cc3d1a9838d42c1a60091af not found: ID does not exist" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.225642 4666 scope.go:117] "RemoveContainer" containerID="71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e" Dec 03 13:28:21 crc kubenswrapper[4666]: E1203 13:28:21.226124 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e\": container with ID starting with 71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e not found: ID does not exist" containerID="71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.226160 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e"} err="failed to get container status \"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e\": rpc error: code = NotFound desc = could not find container \"71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e\": container with ID starting with 71bc98838b294fbd203e92558dae3fde270f1d5f00a0bce5ef419b0864e7748e not found: ID does not exist" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.226185 4666 scope.go:117] "RemoveContainer" containerID="0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f" Dec 03 13:28:21 crc kubenswrapper[4666]: E1203 13:28:21.226483 4666 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f\": container with ID starting with 0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f not found: ID does not exist" containerID="0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.226507 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f"} err="failed to get container status \"0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f\": rpc error: code = NotFound desc = could not find container \"0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f\": container with ID starting with 0d27522ae9e267f7693614df8377f580b1eb33a615b5ed7bd007e048b4326b2f not found: ID does not exist" Dec 03 13:28:21 crc kubenswrapper[4666]: I1203 13:28:21.437149 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" path="/var/lib/kubelet/pods/8e53b0ee-6e13-471e-8ba8-f6b02b5f9962/volumes" Dec 03 13:28:39 crc kubenswrapper[4666]: I1203 13:28:39.866566 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:28:39 crc kubenswrapper[4666]: I1203 13:28:39.867051 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:29:09 crc kubenswrapper[4666]: I1203 13:29:09.865776 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:29:09 crc kubenswrapper[4666]: I1203 13:29:09.866592 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.045905 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-b7f3-account-create-update-f6m25"] Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.057192 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-rzxg8"] Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.066437 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b7f3-account-create-update-f6m25"] Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.075878 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-rzxg8"] Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.434994 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38105880-8017-49b9-a922-d936adc03946" 
path="/var/lib/kubelet/pods/38105880-8017-49b9-a922-d936adc03946/volumes" Dec 03 13:29:37 crc kubenswrapper[4666]: I1203 13:29:37.435957 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a40df46-c8e9-496a-af5c-3066bafd781b" path="/var/lib/kubelet/pods/6a40df46-c8e9-496a-af5c-3066bafd781b/volumes" Dec 03 13:29:39 crc kubenswrapper[4666]: I1203 13:29:39.866369 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:29:39 crc kubenswrapper[4666]: I1203 13:29:39.866870 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:29:39 crc kubenswrapper[4666]: I1203 13:29:39.866914 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:29:39 crc kubenswrapper[4666]: I1203 13:29:39.867626 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:29:39 crc kubenswrapper[4666]: I1203 13:29:39.867670 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" gracePeriod=600 Dec 03 13:29:39 crc kubenswrapper[4666]: E1203 13:29:39.992119 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:29:40 crc kubenswrapper[4666]: I1203 13:29:40.823642 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" exitCode=0 Dec 03 13:29:40 crc kubenswrapper[4666]: I1203 13:29:40.823712 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"} Dec 03 13:29:40 crc kubenswrapper[4666]: I1203 13:29:40.823981 4666 scope.go:117] "RemoveContainer" containerID="ea9d4689da969b1737e3b023e4a5645065744457b21831ef5b38db7303dab606" Dec 03 13:29:40 crc kubenswrapper[4666]: I1203 13:29:40.824665 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 
13:29:40 crc kubenswrapper[4666]: E1203 13:29:40.824923 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.974402 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"] Dec 03 13:29:46 crc kubenswrapper[4666]: E1203 13:29:46.975279 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="registry-server" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.975292 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="registry-server" Dec 03 13:29:46 crc kubenswrapper[4666]: E1203 13:29:46.975306 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="extract-content" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.975312 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="extract-content" Dec 03 13:29:46 crc kubenswrapper[4666]: E1203 13:29:46.975343 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="extract-utilities" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.975350 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="extract-utilities" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.975597 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e53b0ee-6e13-471e-8ba8-f6b02b5f9962" containerName="registry-server" Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.977329 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 13:29:46 crc kubenswrapper[4666]: I1203 13:29:46.988037 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"]
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.153942 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztsrq\" (UniqueName: \"kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.154049 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.154361 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.285010 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.285198 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsrq\" (UniqueName: \"kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.285275 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.285598 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.285663 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.307267 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztsrq\" (UniqueName: \"kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq\") pod \"redhat-marketplace-2hcln\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:47 crc kubenswrapper[4666]: I1203 13:29:47.597026 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:48 crc kubenswrapper[4666]: I1203 13:29:48.437860 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"]
Dec 03 13:29:48 crc kubenswrapper[4666]: I1203 13:29:48.897796 4666 generic.go:334] "Generic (PLEG): container finished" podID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerID="a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8" exitCode=0
Dec 03 13:29:48 crc kubenswrapper[4666]: I1203 13:29:48.897847 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerDied","Data":"a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8"}
Dec 03 13:29:48 crc kubenswrapper[4666]: I1203 13:29:48.898075 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerStarted","Data":"cc5d27b99116545f276de85b00b0027bf60c5b812dcabe3429f767b062100de6"}
Dec 03 13:29:50 crc kubenswrapper[4666]: I1203 13:29:50.915409 4666 generic.go:334] "Generic (PLEG): container finished" podID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerID="d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40" exitCode=0
Dec 03 13:29:50 crc kubenswrapper[4666]: I1203 13:29:50.915464 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerDied","Data":"d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40"}
Dec 03 13:29:51 crc kubenswrapper[4666]: I1203 13:29:51.431228 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:29:51 crc kubenswrapper[4666]: E1203 13:29:51.431518 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:29:51 crc kubenswrapper[4666]: I1203 13:29:51.925827 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerStarted","Data":"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee"}
Dec 03 13:29:51 crc kubenswrapper[4666]: I1203 13:29:51.951944 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2hcln" podStartSLOduration=3.520507056 podStartE2EDuration="5.951923145s" podCreationTimestamp="2025-12-03 13:29:46 +0000 UTC" firstStartedPulling="2025-12-03 13:29:48.899950472 +0000 UTC m=+4577.744911533" lastFinishedPulling="2025-12-03 13:29:51.331366561 +0000 UTC m=+4580.176327622" observedRunningTime="2025-12-03 13:29:51.950732963 +0000 UTC m=+4580.795694044" watchObservedRunningTime="2025-12-03 13:29:51.951923145 +0000 UTC m=+4580.796884206"
Dec 03 13:29:53 crc kubenswrapper[4666]: I1203 13:29:53.453538 4666 scope.go:117] "RemoveContainer" containerID="06854ba650ff3d7b14f2a8d2bc15ec70324ea120d741bbe4c27aa14d3b77b3ae"
Dec 03 13:29:53 crc kubenswrapper[4666]: I1203 13:29:53.478830 4666 scope.go:117] "RemoveContainer" containerID="459b69c0e6ac76c45e4b716dac7238ac9ba1da345d348803f5987c0f4a7455ca"
Dec 03 13:29:57 crc kubenswrapper[4666]: I1203 13:29:57.597230 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:57 crc kubenswrapper[4666]: I1203 13:29:57.597743 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:58 crc kubenswrapper[4666]: I1203 13:29:58.031272 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:58 crc kubenswrapper[4666]: I1203 13:29:58.087898 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2hcln"
Dec 03 13:29:58 crc kubenswrapper[4666]: I1203 13:29:58.761426 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"]
Dec 03 13:29:59 crc kubenswrapper[4666]: I1203 13:29:59.998524 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2hcln" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="registry-server" containerID="cri-o://b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee" gracePeriod=2
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.187548 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"]
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.188783 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.190671 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.190997 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.200032 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"]
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.267255 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.267393 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.267462 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzw8\" (UniqueName: \"kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.369838 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.370260 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzw8\" (UniqueName: \"kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.370309 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.371256 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
\"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.381809 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.389780 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzw8\" (UniqueName: \"kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8\") pod \"collect-profiles-29412810-k4v6h\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.514896 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.652506 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hcln" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.679824 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities\") pod \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.679929 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content\") pod \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.680137 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztsrq\" (UniqueName: \"kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq\") pod \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\" (UID: \"e98c3a75-8f23-4662-8d03-cfd673fb8f97\") " Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.681694 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities" (OuterVolumeSpecName: "utilities") pod "e98c3a75-8f23-4662-8d03-cfd673fb8f97" (UID: "e98c3a75-8f23-4662-8d03-cfd673fb8f97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.693288 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq" (OuterVolumeSpecName: "kube-api-access-ztsrq") pod "e98c3a75-8f23-4662-8d03-cfd673fb8f97" (UID: "e98c3a75-8f23-4662-8d03-cfd673fb8f97"). InnerVolumeSpecName "kube-api-access-ztsrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.717648 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e98c3a75-8f23-4662-8d03-cfd673fb8f97" (UID: "e98c3a75-8f23-4662-8d03-cfd673fb8f97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.783555 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.783602 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztsrq\" (UniqueName: \"kubernetes.io/projected/e98c3a75-8f23-4662-8d03-cfd673fb8f97-kube-api-access-ztsrq\") on node \"crc\" DevicePath \"\"" Dec 03 13:30:00 crc kubenswrapper[4666]: I1203 13:30:00.783617 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e98c3a75-8f23-4662-8d03-cfd673fb8f97-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.003290 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"] Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.010434 4666 generic.go:334] "Generic (PLEG): container finished" podID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerID="b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee" exitCode=0 Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.010472 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerDied","Data":"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee"} Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.010512 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hcln" event={"ID":"e98c3a75-8f23-4662-8d03-cfd673fb8f97","Type":"ContainerDied","Data":"cc5d27b99116545f276de85b00b0027bf60c5b812dcabe3429f767b062100de6"} Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.010530 4666 scope.go:117] "RemoveContainer" containerID="b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.010672 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hcln" Dec 03 13:30:01 crc kubenswrapper[4666]: W1203 13:30:01.014284 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1277e9ce_d1ea_458a_970b_dd65577bae7f.slice/crio-b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971 WatchSource:0}: Error finding container b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971: Status 404 returned error can't find the container with id b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971 Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.047185 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"] Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.052515 4666 scope.go:117] "RemoveContainer" containerID="d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.055935 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hcln"] Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.080275 4666 scope.go:117] "RemoveContainer" containerID="a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.103726 4666 scope.go:117] "RemoveContainer" containerID="b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee" Dec 03 13:30:01 crc kubenswrapper[4666]: E1203 13:30:01.104257 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee\": container with ID starting with b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee not found: ID does not exist" containerID="b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.104308 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee"} err="failed to get container status \"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee\": rpc error: code = NotFound desc = could not find container \"b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee\": container with ID starting with b52af1b05782d273699613794c2ecce14c49fab214ef4fb7cf99dbeb21e45fee not found: ID does not exist" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.104337 4666 scope.go:117] "RemoveContainer" containerID="d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40" Dec 03 13:30:01 crc kubenswrapper[4666]: E1203 13:30:01.104833 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40\": container with ID starting with d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40 not found: ID does not exist" containerID="d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.104882 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40"} err="failed to get container status \"d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40\": rpc error: code = 
NotFound desc = could not find container \"d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40\": container with ID starting with d5d494fb04dbf704a6948b8cfcc5f14df3e1e00335899a726e90c9ed12c3ce40 not found: ID does not exist" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.104904 4666 scope.go:117] "RemoveContainer" containerID="a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8" Dec 03 13:30:01 crc kubenswrapper[4666]: E1203 13:30:01.105254 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8\": container with ID starting with a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8 not found: ID does not exist" containerID="a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.105274 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8"} err="failed to get container status \"a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8\": rpc error: code = NotFound desc = could not find container \"a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8\": container with ID starting with a33767d126b89e83cba8d5cf74dd657f4e028bb124ab473e6903ae2c3e55c1d8 not found: ID does not exist" Dec 03 13:30:01 crc kubenswrapper[4666]: I1203 13:30:01.459221 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" path="/var/lib/kubelet/pods/e98c3a75-8f23-4662-8d03-cfd673fb8f97/volumes" Dec 03 13:30:02 crc kubenswrapper[4666]: I1203 13:30:02.019367 4666 generic.go:334] "Generic (PLEG): container finished" podID="1277e9ce-d1ea-458a-970b-dd65577bae7f" containerID="b6b4cb5e8102bd64a514fe54b96dc3a033ed5a8258a594fe1027fc90a909ab85" exitCode=0 Dec 03 13:30:02 crc kubenswrapper[4666]: I1203 13:30:02.019463 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" event={"ID":"1277e9ce-d1ea-458a-970b-dd65577bae7f","Type":"ContainerDied","Data":"b6b4cb5e8102bd64a514fe54b96dc3a033ed5a8258a594fe1027fc90a909ab85"} Dec 03 13:30:02 crc kubenswrapper[4666]: I1203 13:30:02.019647 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" event={"ID":"1277e9ce-d1ea-458a-970b-dd65577bae7f","Type":"ContainerStarted","Data":"b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971"} Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.527176 4666 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.639996 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume\") pod \"1277e9ce-d1ea-458a-970b-dd65577bae7f\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") "
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.640160 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume\") pod \"1277e9ce-d1ea-458a-970b-dd65577bae7f\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") "
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.640235 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzw8\" (UniqueName: \"kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8\") pod \"1277e9ce-d1ea-458a-970b-dd65577bae7f\" (UID: \"1277e9ce-d1ea-458a-970b-dd65577bae7f\") "
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.640702 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1277e9ce-d1ea-458a-970b-dd65577bae7f" (UID: "1277e9ce-d1ea-458a-970b-dd65577bae7f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.646781 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1277e9ce-d1ea-458a-970b-dd65577bae7f" (UID: "1277e9ce-d1ea-458a-970b-dd65577bae7f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.646817 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8" (OuterVolumeSpecName: "kube-api-access-jwzw8") pod "1277e9ce-d1ea-458a-970b-dd65577bae7f" (UID: "1277e9ce-d1ea-458a-970b-dd65577bae7f"). InnerVolumeSpecName "kube-api-access-jwzw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.742202 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1277e9ce-d1ea-458a-970b-dd65577bae7f-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.742492 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1277e9ce-d1ea-458a-970b-dd65577bae7f-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 13:30:03 crc kubenswrapper[4666]: I1203 13:30:03.742503 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzw8\" (UniqueName: \"kubernetes.io/projected/1277e9ce-d1ea-458a-970b-dd65577bae7f-kube-api-access-jwzw8\") on node \"crc\" DevicePath \"\""
Dec 03 13:30:04 crc kubenswrapper[4666]: I1203 13:30:04.039489 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h" event={"ID":"1277e9ce-d1ea-458a-970b-dd65577bae7f","Type":"ContainerDied","Data":"b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971"}
Dec 03 13:30:04 crc kubenswrapper[4666]: I1203 13:30:04.039547 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b676be94327a7f8054e1284a2e9a297ca95fe64422a1903d8ea02b955fa11971"
Dec 03 13:30:04 crc kubenswrapper[4666]: I1203 13:30:04.039685 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412810-k4v6h"
Dec 03 13:30:04 crc kubenswrapper[4666]: I1203 13:30:04.604405 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs"]
Dec 03 13:30:04 crc kubenswrapper[4666]: I1203 13:30:04.613228 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-5thhs"]
Dec 03 13:30:05 crc kubenswrapper[4666]: I1203 13:30:05.438017 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f1cc55-1f71-490b-8def-9902e96f803a" path="/var/lib/kubelet/pods/a9f1cc55-1f71-490b-8def-9902e96f803a/volumes"
Dec 03 13:30:06 crc kubenswrapper[4666]: I1203 13:30:06.425734 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:30:06 crc kubenswrapper[4666]: E1203 13:30:06.426354 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:30:14 crc kubenswrapper[4666]: I1203 13:30:14.038914 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-m76w2"]
Dec 03 13:30:14 crc kubenswrapper[4666]: I1203 13:30:14.045919 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-m76w2"]
Dec 03 13:30:15 crc kubenswrapper[4666]: I1203 13:30:15.436401 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd740ba-30fe-4c8c-9a36-9d1c56170bb3" path="/var/lib/kubelet/pods/1fd740ba-30fe-4c8c-9a36-9d1c56170bb3/volumes"
Dec 03 13:30:17 crc kubenswrapper[4666]: I1203 13:30:17.423474 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:30:17 crc kubenswrapper[4666]: E1203 13:30:17.424895 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:30:32 crc kubenswrapper[4666]: I1203 13:30:32.424562 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:30:32 crc kubenswrapper[4666]: E1203 13:30:32.425555 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:30:47 crc kubenswrapper[4666]: I1203 13:30:47.423355 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:30:47 crc kubenswrapper[4666]: E1203 13:30:47.424151 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:30:53 crc kubenswrapper[4666]: I1203 13:30:53.587423 4666 scope.go:117] "RemoveContainer" containerID="c07add67fbe553b507eb70160f5e6481f96a10cb8cc0f8b64b7fbeb19570b012"
Dec 03 13:30:53 crc kubenswrapper[4666]: I1203 13:30:53.612746 4666 scope.go:117] "RemoveContainer" containerID="5ec4cfea7301a32be660a94ab5fbcf166ae3a6a063e235c19d7f31d6d1ab318e"
Dec 03 13:31:02 crc kubenswrapper[4666]: I1203 13:31:02.424540 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:31:02 crc kubenswrapper[4666]: E1203 13:31:02.425741 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:31:14 crc kubenswrapper[4666]: I1203 13:31:14.424041 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:31:14 crc kubenswrapper[4666]: E1203 13:31:14.424821 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:31:27 crc kubenswrapper[4666]: I1203 13:31:27.425780 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:31:27 crc kubenswrapper[4666]: E1203 13:31:27.426456 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:31:40 crc kubenswrapper[4666]: I1203 13:31:40.423797 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:31:40 crc kubenswrapper[4666]: E1203 13:31:40.424615 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:31:52 crc kubenswrapper[4666]: I1203 13:31:52.425262 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:31:52 crc kubenswrapper[4666]: E1203 13:31:52.426237 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:32:04 crc kubenswrapper[4666]: I1203 13:32:04.423731 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:32:04 crc kubenswrapper[4666]: E1203 13:32:04.424538 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:32:15 crc kubenswrapper[4666]: I1203 13:32:15.429261 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad"
Dec 03 13:32:15 crc kubenswrapper[4666]: E1203 13:32:15.431344 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:32:28 crc kubenswrapper[4666]: I1203 13:32:28.424425 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:32:28 crc kubenswrapper[4666]: E1203 13:32:28.425488 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:32:42 crc kubenswrapper[4666]: I1203 13:32:42.423709 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:32:42 crc kubenswrapper[4666]: E1203 13:32:42.424707 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:32:57 crc kubenswrapper[4666]: I1203 13:32:57.424283 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:32:57 crc kubenswrapper[4666]: E1203 13:32:57.425345 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:33:12 crc kubenswrapper[4666]: I1203 13:33:12.425686 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:33:12 crc kubenswrapper[4666]: E1203 13:33:12.426633 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:33:26 crc kubenswrapper[4666]: I1203 13:33:26.424751 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:33:26 crc kubenswrapper[4666]: E1203 13:33:26.426292 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:33:38 crc kubenswrapper[4666]: I1203 13:33:38.423894 4666 scope.go:117] "RemoveContainer" 
containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:33:38 crc kubenswrapper[4666]: E1203 13:33:38.425147 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:33:51 crc kubenswrapper[4666]: I1203 13:33:51.434707 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:33:51 crc kubenswrapper[4666]: E1203 13:33:51.435525 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:34:05 crc kubenswrapper[4666]: I1203 13:34:05.424417 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:34:05 crc kubenswrapper[4666]: E1203 13:34:05.425912 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:34:17 crc kubenswrapper[4666]: I1203 13:34:17.423948 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:34:17 crc kubenswrapper[4666]: E1203 13:34:17.424735 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:34:28 crc kubenswrapper[4666]: I1203 13:34:28.424620 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:34:28 crc kubenswrapper[4666]: E1203 13:34:28.425611 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:34:43 crc kubenswrapper[4666]: I1203 13:34:43.423814 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:34:43 crc kubenswrapper[4666]: I1203 13:34:43.650533 4666 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff"} Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.471289 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:31 crc kubenswrapper[4666]: E1203 13:35:31.472381 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="extract-content" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472403 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="extract-content" Dec 03 13:35:31 crc kubenswrapper[4666]: E1203 13:35:31.472426 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="extract-utilities" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472436 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="extract-utilities" Dec 03 13:35:31 crc kubenswrapper[4666]: E1203 13:35:31.472450 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1277e9ce-d1ea-458a-970b-dd65577bae7f" containerName="collect-profiles" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472459 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="1277e9ce-d1ea-458a-970b-dd65577bae7f" containerName="collect-profiles" Dec 03 13:35:31 crc kubenswrapper[4666]: E1203 13:35:31.472493 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="registry-server" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472504 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="registry-server" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472743 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98c3a75-8f23-4662-8d03-cfd673fb8f97" containerName="registry-server" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.472773 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="1277e9ce-d1ea-458a-970b-dd65577bae7f" containerName="collect-profiles" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.478167 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.479212 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.581885 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.582301 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.582334 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqcl8\" (UniqueName: \"kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.683914 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.683963 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqcl8\" (UniqueName: \"kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.684017 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.684431 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.685906 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.704382 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vqcl8\" (UniqueName: \"kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8\") pod \"certified-operators-twp6d\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:31 crc kubenswrapper[4666]: I1203 13:35:31.803831 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:32 crc kubenswrapper[4666]: I1203 13:35:32.594761 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:33 crc kubenswrapper[4666]: I1203 13:35:33.132699 4666 generic.go:334] "Generic (PLEG): container finished" podID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerID="6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c" exitCode=0 Dec 03 13:35:33 crc kubenswrapper[4666]: I1203 13:35:33.132754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerDied","Data":"6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c"} Dec 03 13:35:33 crc kubenswrapper[4666]: I1203 13:35:33.132808 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerStarted","Data":"e10320841daac710ae5db643781717ee88d52b2edb109c8cca7af035c361c606"} Dec 03 13:35:33 crc kubenswrapper[4666]: I1203 13:35:33.134744 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:35:35 crc kubenswrapper[4666]: I1203 13:35:35.154405 4666 generic.go:334] "Generic (PLEG): container finished" podID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerID="690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5" exitCode=0 Dec 03 13:35:35 crc kubenswrapper[4666]: I1203 13:35:35.154470 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerDied","Data":"690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5"} Dec 03 13:35:36 crc kubenswrapper[4666]: I1203 13:35:36.166179 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerStarted","Data":"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3"} Dec 03 13:35:36 crc kubenswrapper[4666]: I1203 13:35:36.186943 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-twp6d" podStartSLOduration=2.687572962 podStartE2EDuration="5.1869238s" podCreationTimestamp="2025-12-03 13:35:31 +0000 UTC" firstStartedPulling="2025-12-03 13:35:33.134510855 +0000 UTC m=+4921.979471906" lastFinishedPulling="2025-12-03 13:35:35.633861683 +0000 UTC m=+4924.478822744" observedRunningTime="2025-12-03 13:35:36.18319262 +0000 UTC m=+4925.028153671" watchObservedRunningTime="2025-12-03 13:35:36.1869238 +0000 UTC m=+4925.031884851" Dec 03 13:35:41 crc kubenswrapper[4666]: I1203 13:35:41.804420 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:41 crc kubenswrapper[4666]: I1203 13:35:41.805017 4666 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:41 crc kubenswrapper[4666]: I1203 13:35:41.880354 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:42 crc kubenswrapper[4666]: I1203 13:35:42.276224 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:42 crc kubenswrapper[4666]: I1203 13:35:42.323795 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.249116 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-twp6d" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="registry-server" containerID="cri-o://0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3" gracePeriod=2 Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.757844 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.857410 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities\") pod \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.857683 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqcl8\" (UniqueName: \"kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8\") pod \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.857778 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content\") pod \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\" (UID: \"e271b75d-69de-4ed9-b50d-99e74f3c2cff\") " Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.858850 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities" (OuterVolumeSpecName: "utilities") pod "e271b75d-69de-4ed9-b50d-99e74f3c2cff" (UID: "e271b75d-69de-4ed9-b50d-99e74f3c2cff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.865688 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8" (OuterVolumeSpecName: "kube-api-access-vqcl8") pod "e271b75d-69de-4ed9-b50d-99e74f3c2cff" (UID: "e271b75d-69de-4ed9-b50d-99e74f3c2cff"). InnerVolumeSpecName "kube-api-access-vqcl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.916752 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e271b75d-69de-4ed9-b50d-99e74f3c2cff" (UID: "e271b75d-69de-4ed9-b50d-99e74f3c2cff"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.961264 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqcl8\" (UniqueName: \"kubernetes.io/projected/e271b75d-69de-4ed9-b50d-99e74f3c2cff-kube-api-access-vqcl8\") on node \"crc\" DevicePath \"\"" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.961312 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:35:44 crc kubenswrapper[4666]: I1203 13:35:44.961332 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e271b75d-69de-4ed9-b50d-99e74f3c2cff-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.263424 4666 generic.go:334] "Generic (PLEG): container finished" podID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerID="0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3" exitCode=0 Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.263503 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-twp6d" Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.263525 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerDied","Data":"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3"} Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.264304 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-twp6d" event={"ID":"e271b75d-69de-4ed9-b50d-99e74f3c2cff","Type":"ContainerDied","Data":"e10320841daac710ae5db643781717ee88d52b2edb109c8cca7af035c361c606"} Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.264337 4666 scope.go:117] "RemoveContainer" containerID="0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3" Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.298992 4666 scope.go:117] "RemoveContainer" containerID="690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5" Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.301867 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.311428 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-twp6d"] Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.331138 4666 scope.go:117] "RemoveContainer" containerID="6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c" Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.362878 4666 scope.go:117] "RemoveContainer" containerID="0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3" Dec 03 13:35:45 crc kubenswrapper[4666]: E1203 13:35:45.363385 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3\": container with ID starting with 0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3 not found: ID does not exist" containerID="0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3" Dec 03 13:35:45 crc 
kubenswrapper[4666]: I1203 13:35:45.363420 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3"} err="failed to get container status \"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3\": rpc error: code = NotFound desc = could not find container \"0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3\": container with ID starting with 0848cd0c1e1a45180c7fd67c1d744e98f4202a0e912ebde523a5827b737fccc3 not found: ID does not exist"
Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.363441 4666 scope.go:117] "RemoveContainer" containerID="690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5"
Dec 03 13:35:45 crc kubenswrapper[4666]: E1203 13:35:45.363868 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5\": container with ID starting with 690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5 not found: ID does not exist" containerID="690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5"
Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.363920 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5"} err="failed to get container status \"690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5\": rpc error: code = NotFound desc = could not find container \"690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5\": container with ID starting with 690650d70d1d6946503552f0a7fd267bbe6da97cfd9af6ca89e5955b342d54b5 not found: ID does not exist"
Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.363954 4666 scope.go:117] "RemoveContainer" containerID="6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c"
Dec 03 13:35:45 crc kubenswrapper[4666]: E1203 13:35:45.364276 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c\": container with ID starting with 6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c not found: ID does not exist" containerID="6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c"
Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.364300 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c"} err="failed to get container status \"6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c\": rpc error: code = NotFound desc = could not find container \"6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c\": container with ID starting with 6140ad8d6ba1906df7684b4869f74256ad5569af42b04ea38ca2ad150a646f4c not found: ID does not exist"
Dec 03 13:35:45 crc kubenswrapper[4666]: I1203 13:35:45.434196 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" path="/var/lib/kubelet/pods/e271b75d-69de-4ed9-b50d-99e74f3c2cff/volumes"
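The NotFound churn above looks like a benign double-delete rather than a real failure: by the time scope.go re-issues "RemoveContainer" for the three already-dead containers, CRI-O has purged them, so the status lookup returns gRPC NotFound and DeleteContainer merely logs it. The idempotent pattern, sketched against a hypothetical CRI client (this is not the kubelet's pod_container_deletor, just the shape of the tolerance):

```go
// removetolerant.go - treat gRPC NotFound on delete as "already gone".
// lookupStatus stands in for a CRI ContainerStatus call; it is hypothetical.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func removeContainer(id string, lookupStatus func(string) error) error {
	if err := lookupStatus(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // deletion already happened; nothing left to remove
		}
		return fmt.Errorf("failed to get container status %q: %w", id, err)
	}
	// ... a real client would proceed to remove the container here ...
	return nil
}

func main() {
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeContainer("0848cd0c1e1a", gone)) // <nil>: NotFound tolerated
}
```

Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.491926 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"]
Dec 03 13:36:33 crc kubenswrapper[4666]: E1203 13:36:33.493035 4666 cpu_manager.go:410]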
"RemoveStaleState: removing container" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="extract-content" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.493052 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="extract-content" Dec 03 13:36:33 crc kubenswrapper[4666]: E1203 13:36:33.493062 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="extract-utilities" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.493068 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="extract-utilities" Dec 03 13:36:33 crc kubenswrapper[4666]: E1203 13:36:33.493114 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="registry-server" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.493121 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="registry-server" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.493358 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e271b75d-69de-4ed9-b50d-99e74f3c2cff" containerName="registry-server" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.494859 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.515819 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"] Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.578645 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.578725 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.578786 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r4j4\" (UniqueName: \"kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.680262 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.680387 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r4j4\" (UniqueName: \"kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4\") pod 
\"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.680483 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.680794 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.680901 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.723520 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r4j4\" (UniqueName: \"kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4\") pod \"redhat-operators-g2gjd\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") " pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:33 crc kubenswrapper[4666]: I1203 13:36:33.832852 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:34 crc kubenswrapper[4666]: I1203 13:36:34.299212 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"] Dec 03 13:36:34 crc kubenswrapper[4666]: I1203 13:36:34.699042 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerStarted","Data":"d38b8927a6e0ff3f3fc401ba6c56976a3e9bc86738fb6735eeb09850f870eb94"} Dec 03 13:36:37 crc kubenswrapper[4666]: I1203 13:36:37.727660 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerStarted","Data":"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b"} Dec 03 13:36:38 crc kubenswrapper[4666]: I1203 13:36:38.738206 4666 generic.go:334] "Generic (PLEG): container finished" podID="08efdf8c-273e-4acd-a564-640f84ddd594" containerID="38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b" exitCode=0 Dec 03 13:36:38 crc kubenswrapper[4666]: I1203 13:36:38.738279 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerDied","Data":"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b"} Dec 03 13:36:40 crc kubenswrapper[4666]: I1203 13:36:40.759512 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerStarted","Data":"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a"} Dec 03 13:36:41 crc kubenswrapper[4666]: I1203 13:36:41.768847 4666 generic.go:334] "Generic (PLEG): container finished" podID="08efdf8c-273e-4acd-a564-640f84ddd594" containerID="12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a" exitCode=0 Dec 03 13:36:41 crc kubenswrapper[4666]: I1203 13:36:41.769024 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerDied","Data":"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a"} Dec 03 13:36:42 crc kubenswrapper[4666]: I1203 13:36:42.780006 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerStarted","Data":"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85"} Dec 03 13:36:42 crc kubenswrapper[4666]: I1203 13:36:42.814732 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g2gjd" podStartSLOduration=6.067606723 podStartE2EDuration="9.814708838s" podCreationTimestamp="2025-12-03 13:36:33 +0000 UTC" firstStartedPulling="2025-12-03 13:36:38.740686442 +0000 UTC m=+4987.585647493" lastFinishedPulling="2025-12-03 13:36:42.487788557 +0000 UTC m=+4991.332749608" observedRunningTime="2025-12-03 13:36:42.80663773 +0000 UTC m=+4991.651598791" watchObservedRunningTime="2025-12-03 13:36:42.814708838 +0000 UTC m=+4991.659669909" Dec 03 13:36:43 crc kubenswrapper[4666]: I1203 13:36:43.832992 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:43 crc kubenswrapper[4666]: I1203 13:36:43.833372 
4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g2gjd"
Dec 03 13:36:44 crc kubenswrapper[4666]: I1203 13:36:44.880158 4666 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g2gjd" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="registry-server" probeResult="failure" output=<
Dec 03 13:36:44 crc kubenswrapper[4666]: timeout: failed to connect service ":50051" within 1s
Dec 03 13:36:44 crc kubenswrapper[4666]: >
Dec 03 13:36:54 crc kubenswrapper[4666]: I1203 13:36:54.326829 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g2gjd"
Dec 03 13:36:54 crc kubenswrapper[4666]: I1203 13:36:54.377196 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g2gjd"
Dec 03 13:36:54 crc kubenswrapper[4666]: I1203 13:36:54.565220 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"]
Dec 03 13:36:55 crc kubenswrapper[4666]: I1203 13:36:55.901504 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g2gjd" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="registry-server" containerID="cri-o://c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85" gracePeriod=2
Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.355759 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2gjd"
Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.470423 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r4j4\" (UniqueName: \"kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4\") pod \"08efdf8c-273e-4acd-a564-640f84ddd594\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") "
Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.470885 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities\") pod \"08efdf8c-273e-4acd-a564-640f84ddd594\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") "
Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.470935 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content\") pod \"08efdf8c-273e-4acd-a564-640f84ddd594\" (UID: \"08efdf8c-273e-4acd-a564-640f84ddd594\") "
Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.472488 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities" (OuterVolumeSpecName: "utilities") pod "08efdf8c-273e-4acd-a564-640f84ddd594" (UID: "08efdf8c-273e-4acd-a564-640f84ddd594"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
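The startup-probe failure earlier in this window (timeout: failed to connect service ":50051" within 1s) reads like a gRPC health probe against the registry-server's serving port, failing simply because the freshly unpacked catalog is not listening yet; ten seconds later the same probe reports started and ready. A sketch of just the connect-with-deadline part (the 1s timeout and :50051 target are taken from the output above; the real probe also performs a gRPC health-check RPC, which is omitted here, so treat this as an assumption-laden approximation):

```go
// probesketch.go - minimal connect-within-1s check mirroring the failure output.
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	addr := ":50051" // the registry-server gRPC port from the probe output
	conn, err := net.DialTimeout("tcp", addr, 1*time.Second)
	if err != nil {
		// Same situation as in the log: nothing is accepting on :50051 yet.
		fmt.Printf("timeout: failed to connect service %q within 1s\n", addr)
		os.Exit(1)
	}
	conn.Close()
	fmt.Println("service", addr, "is accepting connections")
}
```

Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.479586 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4" (OuterVolumeSpecName: "kube-api-access-8r4j4") pod "08efdf8c-273e-4acd-a564-640f84ddd594" (UID: "08efdf8c-273e-4acd-a564-640f84ddd594"). InnerVolumeSpecName "kube-api-access-8r4j4".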
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.573461 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r4j4\" (UniqueName: \"kubernetes.io/projected/08efdf8c-273e-4acd-a564-640f84ddd594-kube-api-access-8r4j4\") on node \"crc\" DevicePath \"\"" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.573784 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.596815 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08efdf8c-273e-4acd-a564-640f84ddd594" (UID: "08efdf8c-273e-4acd-a564-640f84ddd594"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.675398 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08efdf8c-273e-4acd-a564-640f84ddd594-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.911332 4666 generic.go:334] "Generic (PLEG): container finished" podID="08efdf8c-273e-4acd-a564-640f84ddd594" containerID="c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85" exitCode=0 Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.911374 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerDied","Data":"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85"} Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.911422 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2gjd" event={"ID":"08efdf8c-273e-4acd-a564-640f84ddd594","Type":"ContainerDied","Data":"d38b8927a6e0ff3f3fc401ba6c56976a3e9bc86738fb6735eeb09850f870eb94"} Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.911443 4666 scope.go:117] "RemoveContainer" containerID="c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.912674 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2gjd" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.948268 4666 scope.go:117] "RemoveContainer" containerID="12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a" Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.957360 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"] Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.964834 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g2gjd"] Dec 03 13:36:56 crc kubenswrapper[4666]: I1203 13:36:56.969833 4666 scope.go:117] "RemoveContainer" containerID="38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.013150 4666 scope.go:117] "RemoveContainer" containerID="c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85" Dec 03 13:36:57 crc kubenswrapper[4666]: E1203 13:36:57.013864 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85\": container with ID starting with c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85 not found: ID does not exist" containerID="c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.013912 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85"} err="failed to get container status \"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85\": rpc error: code = NotFound desc = could not find container \"c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85\": container with ID starting with c216c6783dd379e6641c6d3176a638f3dd2ce0ad186fa16773382de6a0fbeb85 not found: ID does not exist" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.013938 4666 scope.go:117] "RemoveContainer" containerID="12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a" Dec 03 13:36:57 crc kubenswrapper[4666]: E1203 13:36:57.014429 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a\": container with ID starting with 12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a not found: ID does not exist" containerID="12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.014486 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a"} err="failed to get container status \"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a\": rpc error: code = NotFound desc = could not find container \"12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a\": container with ID starting with 12b629bff1d73a371650d564b75d00e4f3d752cf0a3b1dd7ebfcc8f612571f1a not found: ID does not exist" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.014520 4666 scope.go:117] "RemoveContainer" containerID="38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b" Dec 03 13:36:57 crc kubenswrapper[4666]: E1203 13:36:57.015039 4666 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b\": container with ID starting with 38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b not found: ID does not exist" containerID="38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.015068 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b"} err="failed to get container status \"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b\": rpc error: code = NotFound desc = could not find container \"38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b\": container with ID starting with 38a4076609b4b89692ad63b60931dbf0a30772c6a622bc9a906f368ea0055d0b not found: ID does not exist" Dec 03 13:36:57 crc kubenswrapper[4666]: I1203 13:36:57.465057 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" path="/var/lib/kubelet/pods/08efdf8c-273e-4acd-a564-640f84ddd594/volumes" Dec 03 13:37:09 crc kubenswrapper[4666]: I1203 13:37:09.866636 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:37:09 crc kubenswrapper[4666]: I1203 13:37:09.867252 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:37:39 crc kubenswrapper[4666]: I1203 13:37:39.866134 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:37:39 crc kubenswrapper[4666]: I1203 13:37:39.866794 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:38:09 crc kubenswrapper[4666]: I1203 13:38:09.865583 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:38:09 crc kubenswrapper[4666]: I1203 13:38:09.866148 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:38:09 crc kubenswrapper[4666]: I1203 13:38:09.866190 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:38:09 crc kubenswrapper[4666]: I1203 13:38:09.866916 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:38:09 crc kubenswrapper[4666]: I1203 13:38:09.866975 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff" gracePeriod=600 Dec 03 13:38:10 crc kubenswrapper[4666]: I1203 13:38:10.627248 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff" exitCode=0 Dec 03 13:38:10 crc kubenswrapper[4666]: I1203 13:38:10.627310 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff"} Dec 03 13:38:10 crc kubenswrapper[4666]: I1203 13:38:10.628049 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"} Dec 03 13:38:10 crc kubenswrapper[4666]: I1203 13:38:10.628076 4666 scope.go:117] "RemoveContainer" containerID="d43c63bff4fe7ca6a20d5bf85c1b314302961a661f9b52b59592dc13dfb567ad" Dec 03 13:38:23 crc kubenswrapper[4666]: I1203 13:38:23.788910 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="8e9fa52f-aa3b-4705-8a71-e48befc92571" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 03 13:38:51 crc kubenswrapper[4666]: I1203 13:38:51.289384 4666 generic.go:334] "Generic (PLEG): container finished" podID="33fe3655-5d2c-48bd-8a4b-f436570d149c" containerID="207e1f4e81e314f28fecc8ef2308a893c13f2e05c03701ace0025274dbe29575" exitCode=0 Dec 03 13:38:51 crc kubenswrapper[4666]: I1203 13:38:51.289480 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33fe3655-5d2c-48bd-8a4b-f436570d149c","Type":"ContainerDied","Data":"207e1f4e81e314f28fecc8ef2308a893c13f2e05c03701ace0025274dbe29575"} Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.699930 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.806475 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.806706 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.807488 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.807554 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.807623 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.807677 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.808004 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.808062 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67prk\" (UniqueName: \"kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.808101 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data\") pod \"33fe3655-5d2c-48bd-8a4b-f436570d149c\" (UID: \"33fe3655-5d2c-48bd-8a4b-f436570d149c\") " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.808557 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.809328 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data" (OuterVolumeSpecName: "config-data") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.813909 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.813997 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.814300 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk" (OuterVolumeSpecName: "kube-api-access-67prk") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "kube-api-access-67prk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.836240 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.836842 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.852429 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.858339 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "33fe3655-5d2c-48bd-8a4b-f436570d149c" (UID: "33fe3655-5d2c-48bd-8a4b-f436570d149c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.911241 4666 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912645 4666 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912697 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912723 4666 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912741 4666 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33fe3655-5d2c-48bd-8a4b-f436570d149c-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912763 4666 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912782 4666 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33fe3655-5d2c-48bd-8a4b-f436570d149c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912800 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67prk\" (UniqueName: \"kubernetes.io/projected/33fe3655-5d2c-48bd-8a4b-f436570d149c-kube-api-access-67prk\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.912821 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33fe3655-5d2c-48bd-8a4b-f436570d149c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:52 crc kubenswrapper[4666]: I1203 13:38:52.933950 4666 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 13:38:53 crc kubenswrapper[4666]: I1203 13:38:53.014956 4666 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 13:38:53 crc kubenswrapper[4666]: I1203 13:38:53.312803 4666 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"33fe3655-5d2c-48bd-8a4b-f436570d149c","Type":"ContainerDied","Data":"be46491ee4efbcbbe7c67dfb3c82bc715016c0b5b2909ff57c9ed34cf314da82"} Dec 03 13:38:53 crc kubenswrapper[4666]: I1203 13:38:53.312851 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be46491ee4efbcbbe7c67dfb3c82bc715016c0b5b2909ff57c9ed34cf314da82" Dec 03 13:38:53 crc kubenswrapper[4666]: I1203 13:38:53.312929 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.015890 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 13:39:00 crc kubenswrapper[4666]: E1203 13:39:00.017164 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="extract-utilities" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017184 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="extract-utilities" Dec 03 13:39:00 crc kubenswrapper[4666]: E1203 13:39:00.017199 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fe3655-5d2c-48bd-8a4b-f436570d149c" containerName="tempest-tests-tempest-tests-runner" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017208 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fe3655-5d2c-48bd-8a4b-f436570d149c" containerName="tempest-tests-tempest-tests-runner" Dec 03 13:39:00 crc kubenswrapper[4666]: E1203 13:39:00.017228 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="extract-content" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017235 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="extract-content" Dec 03 13:39:00 crc kubenswrapper[4666]: E1203 13:39:00.017255 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="registry-server" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017263 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="registry-server" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017514 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="33fe3655-5d2c-48bd-8a4b-f436570d149c" containerName="tempest-tests-tempest-tests-runner" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.017563 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="08efdf8c-273e-4acd-a564-640f84ddd594" containerName="registry-server" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.018610 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.025323 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bg7t9" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.026549 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.177940 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.178427 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5c9\" (UniqueName: \"kubernetes.io/projected/57ab82ad-c603-403b-8a3b-9e32e2ffc1ea-kube-api-access-5x5c9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.280125 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.280208 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5c9\" (UniqueName: \"kubernetes.io/projected/57ab82ad-c603-403b-8a3b-9e32e2ffc1ea-kube-api-access-5x5c9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.280884 4666 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.308538 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5c9\" (UniqueName: \"kubernetes.io/projected/57ab82ad-c603-403b-8a3b-9e32e2ffc1ea-kube-api-access-5x5c9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.308896 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc 
kubenswrapper[4666]: I1203 13:39:00.360783 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 13:39:00 crc kubenswrapper[4666]: I1203 13:39:00.868488 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 13:39:01 crc kubenswrapper[4666]: I1203 13:39:01.437731 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea","Type":"ContainerStarted","Data":"a9d54d9f112838312e23adf2b250e316d17866b3e793266297b99fd2c04a91fd"} Dec 03 13:39:03 crc kubenswrapper[4666]: I1203 13:39:03.479616 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"57ab82ad-c603-403b-8a3b-9e32e2ffc1ea","Type":"ContainerStarted","Data":"6d8ffe68b7d44df4e8a522dfaab5cbef501d9bda80943c38e3281cbc4b9629ec"} Dec 03 13:39:03 crc kubenswrapper[4666]: I1203 13:39:03.505111 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.740597518 podStartE2EDuration="4.505074196s" podCreationTimestamp="2025-12-03 13:38:59 +0000 UTC" firstStartedPulling="2025-12-03 13:39:00.871960748 +0000 UTC m=+5129.716921799" lastFinishedPulling="2025-12-03 13:39:01.636437426 +0000 UTC m=+5130.481398477" observedRunningTime="2025-12-03 13:39:03.496347541 +0000 UTC m=+5132.341308612" watchObservedRunningTime="2025-12-03 13:39:03.505074196 +0000 UTC m=+5132.350035247" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.287345 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.290018 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.302901 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.469450 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.469842 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.469960 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.572529 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.572875 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.573044 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.573317 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.573422 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.597613 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt\") pod \"community-operators-vpr69\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:21 crc kubenswrapper[4666]: I1203 13:39:21.611156 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:22 crc kubenswrapper[4666]: I1203 13:39:22.237509 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:22 crc kubenswrapper[4666]: W1203 13:39:22.239726 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42bbcb70_0b56_4ca4_8dbc_1e4e327cc1a3.slice/crio-b5fbc8e6d16bfe3a980e778bc7e0f63630047be637a7f62d45d5e64c3ef8c595 WatchSource:0}: Error finding container b5fbc8e6d16bfe3a980e778bc7e0f63630047be637a7f62d45d5e64c3ef8c595: Status 404 returned error can't find the container with id b5fbc8e6d16bfe3a980e778bc7e0f63630047be637a7f62d45d5e64c3ef8c595 Dec 03 13:39:22 crc kubenswrapper[4666]: I1203 13:39:22.714367 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerStarted","Data":"b5fbc8e6d16bfe3a980e778bc7e0f63630047be637a7f62d45d5e64c3ef8c595"} Dec 03 13:39:23 crc kubenswrapper[4666]: I1203 13:39:23.724949 4666 generic.go:334] "Generic (PLEG): container finished" podID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerID="fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041" exitCode=0 Dec 03 13:39:23 crc kubenswrapper[4666]: I1203 13:39:23.725485 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerDied","Data":"fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041"} Dec 03 13:39:24 crc kubenswrapper[4666]: I1203 13:39:24.740233 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerStarted","Data":"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515"} Dec 03 13:39:25 crc kubenswrapper[4666]: I1203 13:39:25.751921 4666 generic.go:334] "Generic (PLEG): container finished" podID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerID="bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515" exitCode=0 Dec 03 13:39:25 crc kubenswrapper[4666]: I1203 13:39:25.751978 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerDied","Data":"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515"} Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.201756 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5d8dq/must-gather-w2l7k"] Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.204385 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.209286 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5d8dq"/"default-dockercfg-rnbn9" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.209313 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5d8dq"/"kube-root-ca.crt" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.210770 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5d8dq/must-gather-w2l7k"] Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.220059 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5d8dq"/"openshift-service-ca.crt" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.399318 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.399492 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsgx2\" (UniqueName: \"kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.500689 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.500845 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgx2\" (UniqueName: \"kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.501170 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.519389 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsgx2\" (UniqueName: \"kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2\") pod \"must-gather-w2l7k\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") " pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.524719 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.798543 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerStarted","Data":"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257"} Dec 03 13:39:27 crc kubenswrapper[4666]: I1203 13:39:27.852199 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vpr69" podStartSLOduration=4.31908825 podStartE2EDuration="6.852180746s" podCreationTimestamp="2025-12-03 13:39:21 +0000 UTC" firstStartedPulling="2025-12-03 13:39:23.728034231 +0000 UTC m=+5152.572995272" lastFinishedPulling="2025-12-03 13:39:26.261126717 +0000 UTC m=+5155.106087768" observedRunningTime="2025-12-03 13:39:27.845634249 +0000 UTC m=+5156.690595300" watchObservedRunningTime="2025-12-03 13:39:27.852180746 +0000 UTC m=+5156.697141797" Dec 03 13:39:28 crc kubenswrapper[4666]: I1203 13:39:28.129186 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5d8dq/must-gather-w2l7k"] Dec 03 13:39:28 crc kubenswrapper[4666]: W1203 13:39:28.492368 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6fff386_9ad2_4f14_9c07_772e90f24102.slice/crio-4ac1c12ef248df922874e3aff4b12929aa9324be8bc375cec483bf6cad744efe WatchSource:0}: Error finding container 4ac1c12ef248df922874e3aff4b12929aa9324be8bc375cec483bf6cad744efe: Status 404 returned error can't find the container with id 4ac1c12ef248df922874e3aff4b12929aa9324be8bc375cec483bf6cad744efe Dec 03 13:39:28 crc kubenswrapper[4666]: I1203 13:39:28.821733 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" event={"ID":"d6fff386-9ad2-4f14-9c07-772e90f24102","Type":"ContainerStarted","Data":"4ac1c12ef248df922874e3aff4b12929aa9324be8bc375cec483bf6cad744efe"} Dec 03 13:39:31 crc kubenswrapper[4666]: I1203 13:39:31.611321 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:31 crc kubenswrapper[4666]: I1203 13:39:31.624045 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:31 crc kubenswrapper[4666]: I1203 13:39:31.668515 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:31 crc kubenswrapper[4666]: I1203 13:39:31.899482 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:31 crc kubenswrapper[4666]: I1203 13:39:31.942948 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:33 crc kubenswrapper[4666]: I1203 13:39:33.869313 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vpr69" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="registry-server" containerID="cri-o://eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257" gracePeriod=2 Dec 03 13:39:33 crc kubenswrapper[4666]: I1203 13:39:33.869436 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" 
event={"ID":"d6fff386-9ad2-4f14-9c07-772e90f24102","Type":"ContainerStarted","Data":"639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05"} Dec 03 13:39:33 crc kubenswrapper[4666]: I1203 13:39:33.869938 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" event={"ID":"d6fff386-9ad2-4f14-9c07-772e90f24102","Type":"ContainerStarted","Data":"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"} Dec 03 13:39:33 crc kubenswrapper[4666]: I1203 13:39:33.902516 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" podStartSLOduration=2.414797248 podStartE2EDuration="6.902490939s" podCreationTimestamp="2025-12-03 13:39:27 +0000 UTC" firstStartedPulling="2025-12-03 13:39:28.494932407 +0000 UTC m=+5157.339893458" lastFinishedPulling="2025-12-03 13:39:32.982626098 +0000 UTC m=+5161.827587149" observedRunningTime="2025-12-03 13:39:33.893410524 +0000 UTC m=+5162.738371565" watchObservedRunningTime="2025-12-03 13:39:33.902490939 +0000 UTC m=+5162.747451990" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.809781 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.861274 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt\") pod \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.861364 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities\") pod \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.861391 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content\") pod \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\" (UID: \"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3\") " Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.868689 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt" (OuterVolumeSpecName: "kube-api-access-bmpqt") pod "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" (UID: "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3"). InnerVolumeSpecName "kube-api-access-bmpqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.869602 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities" (OuterVolumeSpecName: "utilities") pod "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" (UID: "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.880904 4666 generic.go:334] "Generic (PLEG): container finished" podID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerID="eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257" exitCode=0 Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.881028 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vpr69" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.881059 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerDied","Data":"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257"} Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.881192 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpr69" event={"ID":"42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3","Type":"ContainerDied","Data":"b5fbc8e6d16bfe3a980e778bc7e0f63630047be637a7f62d45d5e64c3ef8c595"} Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.881216 4666 scope.go:117] "RemoveContainer" containerID="eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.923760 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" (UID: "42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.930987 4666 scope.go:117] "RemoveContainer" containerID="bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.952818 4666 scope.go:117] "RemoveContainer" containerID="fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.963314 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-kube-api-access-bmpqt\") on node \"crc\" DevicePath \"\"" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.963343 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:39:34 crc kubenswrapper[4666]: I1203 13:39:34.963352 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.029131 4666 scope.go:117] "RemoveContainer" containerID="eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257" Dec 03 13:39:35 crc kubenswrapper[4666]: E1203 13:39:35.029560 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257\": container with ID starting with eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257 not found: ID does not exist" 
containerID="eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.029609 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257"} err="failed to get container status \"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257\": rpc error: code = NotFound desc = could not find container \"eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257\": container with ID starting with eec06a977cc210c79e47e62591227e3287429fcdd0e086cea8e6d985d1ca7257 not found: ID does not exist" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.029639 4666 scope.go:117] "RemoveContainer" containerID="bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515" Dec 03 13:39:35 crc kubenswrapper[4666]: E1203 13:39:35.029941 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515\": container with ID starting with bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515 not found: ID does not exist" containerID="bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.029983 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515"} err="failed to get container status \"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515\": rpc error: code = NotFound desc = could not find container \"bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515\": container with ID starting with bb7135a40eaa03a2f5aa9a39c3da2f623d570fc0f6d2592a0ece325730770515 not found: ID does not exist" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.030012 4666 scope.go:117] "RemoveContainer" containerID="fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041" Dec 03 13:39:35 crc kubenswrapper[4666]: E1203 13:39:35.030335 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041\": container with ID starting with fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041 not found: ID does not exist" containerID="fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.030364 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041"} err="failed to get container status \"fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041\": rpc error: code = NotFound desc = could not find container \"fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041\": container with ID starting with fd25d71bf7c815e838e4467c4d57c8c3218eeb439dba87b36dbcadb6c15e0041 not found: ID does not exist" Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.214012 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.223992 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vpr69"] Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.433685 
Dec 03 13:39:35 crc kubenswrapper[4666]: I1203 13:39:35.433685 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" path="/var/lib/kubelet/pods/42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3/volumes"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.264461 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-b9lkt"]
Dec 03 13:39:37 crc kubenswrapper[4666]: E1203 13:39:37.265525 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="registry-server"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.265541 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="registry-server"
Dec 03 13:39:37 crc kubenswrapper[4666]: E1203 13:39:37.265554 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="extract-utilities"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.265560 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="extract-utilities"
Dec 03 13:39:37 crc kubenswrapper[4666]: E1203 13:39:37.265574 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="extract-content"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.265581 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="extract-content"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.265805 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bbcb70-0b56-4ca4-8dbc-1e4e327cc1a3" containerName="registry-server"
Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.266533 4666 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.310559 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz8wj\" (UniqueName: \"kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.310717 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.413075 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.413196 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.413402 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz8wj\" (UniqueName: \"kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.688051 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz8wj\" (UniqueName: \"kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj\") pod \"crc-debug-b9lkt\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: I1203 13:39:37.886217 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:39:37 crc kubenswrapper[4666]: W1203 13:39:37.916465 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a2905f_f869_4606_9c4e_6b487c370528.slice/crio-21ce99cc92a1df1690e5dc471842eebfef5d776076898daf145caa51e2699a5e WatchSource:0}: Error finding container 21ce99cc92a1df1690e5dc471842eebfef5d776076898daf145caa51e2699a5e: Status 404 returned error can't find the container with id 21ce99cc92a1df1690e5dc471842eebfef5d776076898daf145caa51e2699a5e Dec 03 13:39:38 crc kubenswrapper[4666]: I1203 13:39:38.919031 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" event={"ID":"99a2905f-f869-4606-9c4e-6b487c370528","Type":"ContainerStarted","Data":"21ce99cc92a1df1690e5dc471842eebfef5d776076898daf145caa51e2699a5e"} Dec 03 13:39:52 crc kubenswrapper[4666]: I1203 13:39:52.033694 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" event={"ID":"99a2905f-f869-4606-9c4e-6b487c370528","Type":"ContainerStarted","Data":"39773b043e319b66536d052b352275f429a53be12ec99508c08ed24e1a8ecc1b"} Dec 03 13:39:52 crc kubenswrapper[4666]: I1203 13:39:52.052204 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" podStartSLOduration=1.879618784 podStartE2EDuration="15.052176736s" podCreationTimestamp="2025-12-03 13:39:37 +0000 UTC" firstStartedPulling="2025-12-03 13:39:37.918550854 +0000 UTC m=+5166.763511905" lastFinishedPulling="2025-12-03 13:39:51.091108806 +0000 UTC m=+5179.936069857" observedRunningTime="2025-12-03 13:39:52.046564925 +0000 UTC m=+5180.891525986" watchObservedRunningTime="2025-12-03 13:39:52.052176736 +0000 UTC m=+5180.897137787" Dec 03 13:40:39 crc kubenswrapper[4666]: I1203 13:40:39.866799 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:40:39 crc kubenswrapper[4666]: I1203 13:40:39.867436 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:40:48 crc kubenswrapper[4666]: I1203 13:40:48.624576 4666 generic.go:334] "Generic (PLEG): container finished" podID="99a2905f-f869-4606-9c4e-6b487c370528" containerID="39773b043e319b66536d052b352275f429a53be12ec99508c08ed24e1a8ecc1b" exitCode=0 Dec 03 13:40:48 crc kubenswrapper[4666]: I1203 13:40:48.624649 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" event={"ID":"99a2905f-f869-4606-9c4e-6b487c370528","Type":"ContainerDied","Data":"39773b043e319b66536d052b352275f429a53be12ec99508c08ed24e1a8ecc1b"} Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.772816 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.827154 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-b9lkt"] Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.838756 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-b9lkt"] Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.915705 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host\") pod \"99a2905f-f869-4606-9c4e-6b487c370528\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.915709 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host" (OuterVolumeSpecName: "host") pod "99a2905f-f869-4606-9c4e-6b487c370528" (UID: "99a2905f-f869-4606-9c4e-6b487c370528"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.916012 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz8wj\" (UniqueName: \"kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj\") pod \"99a2905f-f869-4606-9c4e-6b487c370528\" (UID: \"99a2905f-f869-4606-9c4e-6b487c370528\") " Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.916842 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a2905f-f869-4606-9c4e-6b487c370528-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:49 crc kubenswrapper[4666]: I1203 13:40:49.924756 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj" (OuterVolumeSpecName: "kube-api-access-wz8wj") pod "99a2905f-f869-4606-9c4e-6b487c370528" (UID: "99a2905f-f869-4606-9c4e-6b487c370528"). InnerVolumeSpecName "kube-api-access-wz8wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:40:50 crc kubenswrapper[4666]: I1203 13:40:50.018863 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz8wj\" (UniqueName: \"kubernetes.io/projected/99a2905f-f869-4606-9c4e-6b487c370528-kube-api-access-wz8wj\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:50 crc kubenswrapper[4666]: I1203 13:40:50.657521 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ce99cc92a1df1690e5dc471842eebfef5d776076898daf145caa51e2699a5e" Dec 03 13:40:50 crc kubenswrapper[4666]: I1203 13:40:50.657626 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-b9lkt" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.089063 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-cfbwj"] Dec 03 13:40:51 crc kubenswrapper[4666]: E1203 13:40:51.089746 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a2905f-f869-4606-9c4e-6b487c370528" containerName="container-00" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.089760 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a2905f-f869-4606-9c4e-6b487c370528" containerName="container-00" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.089968 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a2905f-f869-4606-9c4e-6b487c370528" containerName="container-00" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.090551 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.140539 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.140592 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ddq\" (UniqueName: \"kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.241763 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.241814 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77ddq\" (UniqueName: \"kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.241905 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.261041 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ddq\" (UniqueName: \"kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq\") pod \"crc-debug-cfbwj\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.409337 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.463724 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a2905f-f869-4606-9c4e-6b487c370528" path="/var/lib/kubelet/pods/99a2905f-f869-4606-9c4e-6b487c370528/volumes" Dec 03 13:40:51 crc kubenswrapper[4666]: I1203 13:40:51.670758 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" event={"ID":"90a9918d-34f1-4c12-8194-784074a9c5f5","Type":"ContainerStarted","Data":"dee45a8be27e82fa0a7b573e273ac37c0c47d70fcabe66e9289bfd6258bd30bf"} Dec 03 13:40:52 crc kubenswrapper[4666]: I1203 13:40:52.680551 4666 generic.go:334] "Generic (PLEG): container finished" podID="90a9918d-34f1-4c12-8194-784074a9c5f5" containerID="7b3ddb084a6f78a43d6ab9268f4548253cd2a7b7a0348721a15e30d762270760" exitCode=0 Dec 03 13:40:52 crc kubenswrapper[4666]: I1203 13:40:52.680602 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" event={"ID":"90a9918d-34f1-4c12-8194-784074a9c5f5","Type":"ContainerDied","Data":"7b3ddb084a6f78a43d6ab9268f4548253cd2a7b7a0348721a15e30d762270760"} Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.770246 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.786295 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host\") pod \"90a9918d-34f1-4c12-8194-784074a9c5f5\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.786371 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host" (OuterVolumeSpecName: "host") pod "90a9918d-34f1-4c12-8194-784074a9c5f5" (UID: "90a9918d-34f1-4c12-8194-784074a9c5f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.787493 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a9918d-34f1-4c12-8194-784074a9c5f5-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.888025 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77ddq\" (UniqueName: \"kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq\") pod \"90a9918d-34f1-4c12-8194-784074a9c5f5\" (UID: \"90a9918d-34f1-4c12-8194-784074a9c5f5\") " Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.892698 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq" (OuterVolumeSpecName: "kube-api-access-77ddq") pod "90a9918d-34f1-4c12-8194-784074a9c5f5" (UID: "90a9918d-34f1-4c12-8194-784074a9c5f5"). InnerVolumeSpecName "kube-api-access-77ddq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:40:53 crc kubenswrapper[4666]: I1203 13:40:53.990327 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77ddq\" (UniqueName: \"kubernetes.io/projected/90a9918d-34f1-4c12-8194-784074a9c5f5-kube-api-access-77ddq\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:54 crc kubenswrapper[4666]: I1203 13:40:54.660070 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-cfbwj"] Dec 03 13:40:54 crc kubenswrapper[4666]: I1203 13:40:54.667958 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-cfbwj"] Dec 03 13:40:54 crc kubenswrapper[4666]: I1203 13:40:54.696173 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee45a8be27e82fa0a7b573e273ac37c0c47d70fcabe66e9289bfd6258bd30bf" Dec 03 13:40:54 crc kubenswrapper[4666]: I1203 13:40:54.696237 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-cfbwj" Dec 03 13:40:55 crc kubenswrapper[4666]: I1203 13:40:55.436467 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a9918d-34f1-4c12-8194-784074a9c5f5" path="/var/lib/kubelet/pods/90a9918d-34f1-4c12-8194-784074a9c5f5/volumes" Dec 03 13:40:55 crc kubenswrapper[4666]: I1203 13:40:55.851358 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-6769k"] Dec 03 13:40:55 crc kubenswrapper[4666]: E1203 13:40:55.851798 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a9918d-34f1-4c12-8194-784074a9c5f5" containerName="container-00" Dec 03 13:40:55 crc kubenswrapper[4666]: I1203 13:40:55.851819 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a9918d-34f1-4c12-8194-784074a9c5f5" containerName="container-00" Dec 03 13:40:55 crc kubenswrapper[4666]: I1203 13:40:55.852028 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a9918d-34f1-4c12-8194-784074a9c5f5" containerName="container-00" Dec 03 13:40:55 crc kubenswrapper[4666]: I1203 13:40:55.852678 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.036797 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ccn\" (UniqueName: \"kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.037146 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.138847 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.138934 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ccn\" (UniqueName: \"kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.138980 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.160651 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ccn\" (UniqueName: \"kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn\") pod \"crc-debug-6769k\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.173938 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:56 crc kubenswrapper[4666]: W1203 13:40:56.201581 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35c8a9d2_8856_458c_b92d_5c523f98a947.slice/crio-2190d9d47b5b04e596287aec2d9c59ccb633747c3e40d72f40d96b7fd89bd1c8 WatchSource:0}: Error finding container 2190d9d47b5b04e596287aec2d9c59ccb633747c3e40d72f40d96b7fd89bd1c8: Status 404 returned error can't find the container with id 2190d9d47b5b04e596287aec2d9c59ccb633747c3e40d72f40d96b7fd89bd1c8 Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.715808 4666 generic.go:334] "Generic (PLEG): container finished" podID="35c8a9d2-8856-458c-b92d-5c523f98a947" containerID="b11fb8a51afb79a7c7721736553a328eda181c748cd74f5ab2e493b13ab3a1da" exitCode=0 Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.715884 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-6769k" event={"ID":"35c8a9d2-8856-458c-b92d-5c523f98a947","Type":"ContainerDied","Data":"b11fb8a51afb79a7c7721736553a328eda181c748cd74f5ab2e493b13ab3a1da"} Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.716234 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/crc-debug-6769k" event={"ID":"35c8a9d2-8856-458c-b92d-5c523f98a947","Type":"ContainerStarted","Data":"2190d9d47b5b04e596287aec2d9c59ccb633747c3e40d72f40d96b7fd89bd1c8"} Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.758042 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-6769k"] Dec 03 13:40:56 crc kubenswrapper[4666]: I1203 13:40:56.773704 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5d8dq/crc-debug-6769k"] Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.838504 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.974268 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host\") pod \"35c8a9d2-8856-458c-b92d-5c523f98a947\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.974322 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7ccn\" (UniqueName: \"kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn\") pod \"35c8a9d2-8856-458c-b92d-5c523f98a947\" (UID: \"35c8a9d2-8856-458c-b92d-5c523f98a947\") " Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.974613 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host" (OuterVolumeSpecName: "host") pod "35c8a9d2-8856-458c-b92d-5c523f98a947" (UID: "35c8a9d2-8856-458c-b92d-5c523f98a947"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.975224 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35c8a9d2-8856-458c-b92d-5c523f98a947-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:57 crc kubenswrapper[4666]: I1203 13:40:57.983907 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn" (OuterVolumeSpecName: "kube-api-access-z7ccn") pod "35c8a9d2-8856-458c-b92d-5c523f98a947" (UID: "35c8a9d2-8856-458c-b92d-5c523f98a947"). InnerVolumeSpecName "kube-api-access-z7ccn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:40:58 crc kubenswrapper[4666]: I1203 13:40:58.077227 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7ccn\" (UniqueName: \"kubernetes.io/projected/35c8a9d2-8856-458c-b92d-5c523f98a947-kube-api-access-z7ccn\") on node \"crc\" DevicePath \"\"" Dec 03 13:40:58 crc kubenswrapper[4666]: I1203 13:40:58.737501 4666 scope.go:117] "RemoveContainer" containerID="b11fb8a51afb79a7c7721736553a328eda181c748cd74f5ab2e493b13ab3a1da" Dec 03 13:40:58 crc kubenswrapper[4666]: I1203 13:40:58.737523 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/crc-debug-6769k" Dec 03 13:40:59 crc kubenswrapper[4666]: I1203 13:40:59.442193 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c8a9d2-8856-458c-b92d-5c523f98a947" path="/var/lib/kubelet/pods/35c8a9d2-8856-458c-b92d-5c523f98a947/volumes" Dec 03 13:41:09 crc kubenswrapper[4666]: I1203 13:41:09.866023 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:41:09 crc kubenswrapper[4666]: I1203 13:41:09.868014 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:41:21 crc kubenswrapper[4666]: I1203 13:41:21.747265 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6d7b74fd-t9f48_e1a5016b-ec1e-485e-bedf-ced8377f2aae/barbican-api/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.145547 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6d7b74fd-t9f48_e1a5016b-ec1e-485e-bedf-ced8377f2aae/barbican-api-log/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.208400 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64566d6d86-kczk4_b0043f11-1613-418f-9974-e88038dd7e5e/barbican-keystone-listener/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.439435 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f7f466d4c-4ps5s_7dde289d-753b-4a00-8863-b671281a0bef/barbican-worker/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.485835 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-64566d6d86-kczk4_b0043f11-1613-418f-9974-e88038dd7e5e/barbican-keystone-listener-log/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.499356 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f7f466d4c-4ps5s_7dde289d-753b-4a00-8863-b671281a0bef/barbican-worker-log/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.693278 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj_1bcf0d09-1d7c-4a79-a477-f10b1584bc42/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.726528 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/ceilometer-central-agent/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.915495 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/proxy-httpd/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.924809 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/ceilometer-notification-agent/0.log" Dec 03 13:41:22 crc kubenswrapper[4666]: I1203 13:41:22.966503 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/sg-core/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.160346 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-29td7_0d695826-87fe-4625-9925-988306a9e16b/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.234008 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf_01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.421111 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_58b62594-91e8-4cc7-8076-094fba5bcc66/cinder-api-log/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.427851 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_58b62594-91e8-4cc7-8076-094fba5bcc66/cinder-api/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.640453 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_37430c8d-6678-44c5-a349-8cb94fbb9108/probe/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.679126 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_37430c8d-6678-44c5-a349-8cb94fbb9108/cinder-backup/0.log" Dec 03 13:41:23 crc kubenswrapper[4666]: I1203 13:41:23.777040 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b7e78364-3d2e-435a-a0fb-d85cb2586006/cinder-scheduler/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.338193 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_978def64-fbe5-4ce2-a2ab-f12bd95ef64a/probe/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.390614 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_978def64-fbe5-4ce2-a2ab-f12bd95ef64a/cinder-volume/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.422508 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b7e78364-3d2e-435a-a0fb-d85cb2586006/probe/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.594187 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs_55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.797242 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx_49d2d5f0-5c89-4847-856e-cf9ed17510ec/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:24 crc kubenswrapper[4666]: I1203 13:41:24.821385 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/init/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.107421 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/init/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.114425 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/dnsmasq-dns/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.136838 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0db6845-8502-4fa7-acdf-20c2395ca177/glance-httpd/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.536659 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0db6845-8502-4fa7-acdf-20c2395ca177/glance-log/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.547941 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8c0a38f-bf71-4907-ba78-bef2e7227dc6/glance-httpd/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.580721 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8c0a38f-bf71-4907-ba78-bef2e7227dc6/glance-log/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.765415 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bd58698c4-v4zw4_a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd/horizon/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.812424 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8_7a99d58b-e139-49f4-8689-faeb388b82ff/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.919009 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bd58698c4-v4zw4_a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd/horizon-log/0.log" Dec 03 13:41:25 crc kubenswrapper[4666]: I1203 13:41:25.999822 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r2rn2_19109296-aca4-46fa-95d5-70dcd8604ab7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:26 crc kubenswrapper[4666]: I1203 13:41:26.235239 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29412781-4pd56_867a9f13-2579-4e34-9c29-97847041400d/keystone-cron/0.log" Dec 03 13:41:26 crc kubenswrapper[4666]: I1203 13:41:26.249689 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d2050e43-459e-42d5-ae48-1e8e03dd089f/kube-state-metrics/0.log" Dec 03 13:41:26 crc kubenswrapper[4666]: I1203 13:41:26.487194 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v_a165368e-be15-48d7-afad-92850b6844ea/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:26 crc kubenswrapper[4666]: I1203 13:41:26.887415 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8015a12c-752c-489f-a52f-da3bf0ab2977/manila-api/0.log" Dec 03 13:41:26 crc kubenswrapper[4666]: I1203 13:41:26.974978 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1aee1c1a-28b6-4db6-b927-a484fa641914/probe/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.041391 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5fdb97596b-722zc_fad25ce8-9656-42a7-bc6a-369e68732b1e/keystone-api/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.081322 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1aee1c1a-28b6-4db6-b927-a484fa641914/manila-scheduler/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.281383 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_11798731-763e-4a1e-97dc-54a4ff717ddf/probe/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.545725 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_11798731-763e-4a1e-97dc-54a4ff717ddf/manila-share/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.573373 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8015a12c-752c-489f-a52f-da3bf0ab2977/manila-api-log/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.670993 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-96d8bfbbf-pd9x2_c6fc5a47-ba09-4985-b0d7-26d824dd60e3/neutron-api/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.814413 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk_109066d4-b3b2-4ec6-ba71-cfc35d9ca300/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:27 crc kubenswrapper[4666]: I1203 13:41:27.868440 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-96d8bfbbf-pd9x2_c6fc5a47-ba09-4985-b0d7-26d824dd60e3/neutron-httpd/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.177006 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d91a5463-a0cd-40be-90b0-e01d8f1ebdf3/nova-api-log/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.338703 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_aba61f3d-9288-44a6-b194-c136cc1bda0a/nova-cell0-conductor-conductor/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.482462 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d91a5463-a0cd-40be-90b0-e01d8f1ebdf3/nova-api-api/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.507270 4666 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_31efcd1f-210a-408a-b711-6862b6537a7d/nova-cell1-conductor-conductor/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.725195 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_22c93a41-9194-4d6c-a77a-3310870cb513/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 13:41:28 crc kubenswrapper[4666]: I1203 13:41:28.776468 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7_dfec4a43-8c2c-4dab-b3c1-2bc56e71d330/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.046753 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8ae0d3e5-4249-417a-aac2-5280115b1213/nova-metadata-log/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.197674 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2a6eed91-42df-4c6b-aaa0-7882ecfb941a/nova-scheduler-scheduler/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.312970 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/mysql-bootstrap/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.672776 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/mysql-bootstrap/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.747287 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/galera/0.log" Dec 03 13:41:29 crc kubenswrapper[4666]: I1203 13:41:29.897366 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/mysql-bootstrap/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.128508 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/mysql-bootstrap/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.151521 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/galera/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.303485 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a7c6b242-ba03-4e43-9061-e908c5af1c78/openstackclient/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.338150 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l9wjt_844dc007-fbd5-4ca4-9f2f-dc3f2382a653/openstack-network-exporter/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.538928 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nsf9r_e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342/ovn-controller/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.631580 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8ae0d3e5-4249-417a-aac2-5280115b1213/nova-metadata-metadata/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.769514 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server-init/0.log" Dec 03 13:41:30 crc 
kubenswrapper[4666]: I1203 13:41:30.933704 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server/0.log" Dec 03 13:41:30 crc kubenswrapper[4666]: I1203 13:41:30.936595 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server-init/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.010572 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovs-vswitchd/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.187339 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-8bt2q_1f8dd079-749d-4ad3-8365-eb026d693512/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.248100 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e9fa52f-aa3b-4705-8a71-e48befc92571/openstack-network-exporter/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.320273 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e9fa52f-aa3b-4705-8a71-e48befc92571/ovn-northd/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.509740 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55083d6a-bded-48e2-a0ce-3befa24ce873/openstack-network-exporter/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.546108 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55083d6a-bded-48e2-a0ce-3befa24ce873/ovsdbserver-nb/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.711850 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bdd6a24-e604-459e-8eba-ea0d2638fdf5/ovsdbserver-sb/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.793527 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bdd6a24-e604-459e-8eba-ea0d2638fdf5/openstack-network-exporter/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.930563 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9676dbdb4-pcj6f_08966743-608f-40d3-9a26-2515ef964f0f/placement-api/0.log" Dec 03 13:41:31 crc kubenswrapper[4666]: I1203 13:41:31.973121 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9676dbdb4-pcj6f_08966743-608f-40d3-9a26-2515ef964f0f/placement-log/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.011330 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/setup-container/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.246228 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/setup-container/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.265985 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/rabbitmq/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.353013 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/setup-container/0.log" Dec 03 13:41:32 crc 
kubenswrapper[4666]: I1203 13:41:32.553435 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/setup-container/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.617798 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/rabbitmq/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.629413 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd_a854a0c8-aad2-4681-9077-c8abd034fa73/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.851945 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j_de91e472-2cc8-4eaf-91a3-49719f18e3f3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:32 crc kubenswrapper[4666]: I1203 13:41:32.923194 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-flnhs_434bedfb-c0c1-45d4-ade6-8e5112122e58/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:33 crc kubenswrapper[4666]: I1203 13:41:33.105249 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-npxb5_15e79024-d1c5-4689-900b-92ded975568d/ssh-known-hosts-edpm-deployment/0.log" Dec 03 13:41:33 crc kubenswrapper[4666]: I1203 13:41:33.313473 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_33fe3655-5d2c-48bd-8a4b-f436570d149c/tempest-tests-tempest-tests-runner/0.log" Dec 03 13:41:33 crc kubenswrapper[4666]: I1203 13:41:33.345966 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_57ab82ad-c603-403b-8a3b-9e32e2ffc1ea/test-operator-logs-container/0.log" Dec 03 13:41:33 crc kubenswrapper[4666]: I1203 13:41:33.551951 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9drzh_acf58997-af21-4832-a74c-f81057c84d08/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:41:39 crc kubenswrapper[4666]: I1203 13:41:39.866237 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:41:39 crc kubenswrapper[4666]: I1203 13:41:39.866831 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:41:39 crc kubenswrapper[4666]: I1203 13:41:39.866884 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:41:39 crc kubenswrapper[4666]: I1203 13:41:39.868327 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"} 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:41:39 crc kubenswrapper[4666]: I1203 13:41:39.868390 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" gracePeriod=600 Dec 03 13:41:39 crc kubenswrapper[4666]: E1203 13:41:39.996852 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:41:40 crc kubenswrapper[4666]: I1203 13:41:40.122149 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" exitCode=0 Dec 03 13:41:40 crc kubenswrapper[4666]: I1203 13:41:40.122201 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"} Dec 03 13:41:40 crc kubenswrapper[4666]: I1203 13:41:40.122239 4666 scope.go:117] "RemoveContainer" containerID="6a464ae0a9428614165c6fb33b80d1aafcc58c94ae1f6920f040bcb85b55acff" Dec 03 13:41:40 crc kubenswrapper[4666]: I1203 13:41:40.122709 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:41:40 crc kubenswrapper[4666]: E1203 13:41:40.122981 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:41:48 crc kubenswrapper[4666]: I1203 13:41:48.816074 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53/memcached/0.log" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.524769 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:41:52 crc kubenswrapper[4666]: E1203 13:41:52.526268 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c8a9d2-8856-458c-b92d-5c523f98a947" containerName="container-00" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.526282 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c8a9d2-8856-458c-b92d-5c523f98a947" containerName="container-00" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.526462 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c8a9d2-8856-458c-b92d-5c523f98a947" containerName="container-00" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.527898 4666 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.545201 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.704969 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.705023 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.705113 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5m8\" (UniqueName: \"kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.807081 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.807155 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.807237 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5m8\" (UniqueName: \"kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.807634 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.807701 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.828219 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xn5m8\" (UniqueName: \"kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8\") pod \"redhat-marketplace-2744f\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:52 crc kubenswrapper[4666]: I1203 13:41:52.869100 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:41:53 crc kubenswrapper[4666]: I1203 13:41:53.202530 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:41:53 crc kubenswrapper[4666]: I1203 13:41:53.243772 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerStarted","Data":"18eb967cbc7d3d9fffaf9a6a4df1cdab5cf5b1038f39f6faa3db4e3132927445"} Dec 03 13:41:54 crc kubenswrapper[4666]: I1203 13:41:54.254588 4666 generic.go:334] "Generic (PLEG): container finished" podID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerID="cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a" exitCode=0 Dec 03 13:41:54 crc kubenswrapper[4666]: I1203 13:41:54.254915 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerDied","Data":"cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a"} Dec 03 13:41:54 crc kubenswrapper[4666]: I1203 13:41:54.256961 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:41:55 crc kubenswrapper[4666]: I1203 13:41:55.424401 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:41:55 crc kubenswrapper[4666]: E1203 13:41:55.425241 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:41:55 crc kubenswrapper[4666]: E1203 13:41:55.862591 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e05d12_adab_4860_b058_33a6bbc0fb6c.slice/crio-14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148.scope\": RecentStats: unable to find data in memory cache]" Dec 03 13:41:56 crc kubenswrapper[4666]: I1203 13:41:56.278406 4666 generic.go:334] "Generic (PLEG): container finished" podID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerID="14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148" exitCode=0 Dec 03 13:41:56 crc kubenswrapper[4666]: I1203 13:41:56.278507 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerDied","Data":"14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148"} Dec 03 13:41:57 crc kubenswrapper[4666]: I1203 13:41:57.290854 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerStarted","Data":"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2"} Dec 03 13:41:57 crc kubenswrapper[4666]: I1203 13:41:57.313137 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2744f" podStartSLOduration=2.861249504 podStartE2EDuration="5.313112593s" podCreationTimestamp="2025-12-03 13:41:52 +0000 UTC" firstStartedPulling="2025-12-03 13:41:54.256695 +0000 UTC m=+5303.101656051" lastFinishedPulling="2025-12-03 13:41:56.708558089 +0000 UTC m=+5305.553519140" observedRunningTime="2025-12-03 13:41:57.308040876 +0000 UTC m=+5306.153001937" watchObservedRunningTime="2025-12-03 13:41:57.313112593 +0000 UTC m=+5306.158073664" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.601114 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp8qs_72fef244-af95-4c84-889b-04317e2f85e4/kube-rbac-proxy/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.635856 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-c4plc_75171f12-3098-437a-a941-31312676f362/kube-rbac-proxy/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.704354 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp8qs_72fef244-af95-4c84-889b-04317e2f85e4/manager/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.813631 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-jzzkf_21ae197d-ae5d-4129-b1db-114a42dc5eb8/kube-rbac-proxy/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.849440 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-c4plc_75171f12-3098-437a-a941-31312676f362/manager/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.870129 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.870188 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.887017 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-jzzkf_21ae197d-ae5d-4129-b1db-114a42dc5eb8/manager/0.log" Dec 03 13:42:02 crc kubenswrapper[4666]: I1203 13:42:02.918784 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.021518 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.205233 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.230566 4666 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.259937 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.396345 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.409144 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.457801 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.459837 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.496100 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/extract/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.590528 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-j6dh9_6b5f798a-8be3-4c12-948b-4b9ff35d14ba/kube-rbac-proxy/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.693587 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-j6dh9_6b5f798a-8be3-4c12-948b-4b9ff35d14ba/manager/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.731418 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9cbjr_330ae135-611a-4ae6-ba73-fcb6a911c299/kube-rbac-proxy/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.805260 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9cbjr_330ae135-611a-4ae6-ba73-fcb6a911c299/manager/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.943564 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dn9wg_1d9a62f9-0c20-4033-84d4-ade04922d04a/kube-rbac-proxy/0.log" Dec 03 13:42:03 crc kubenswrapper[4666]: I1203 13:42:03.966602 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dn9wg_1d9a62f9-0c20-4033-84d4-ade04922d04a/manager/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.053024 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-vxrg7_e9197948-361b-43e7-8cc6-db509c80c7b1/kube-rbac-proxy/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.262893 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-97svz_dce8c65f-3951-4e68-a044-c4c59638fd05/manager/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.288405 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-97svz_dce8c65f-3951-4e68-a044-c4c59638fd05/kube-rbac-proxy/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.292881 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-vxrg7_e9197948-361b-43e7-8cc6-db509c80c7b1/manager/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.635145 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-n287g_b3582f8c-2777-4291-bc6a-42953fd2d928/kube-rbac-proxy/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.770403 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-n287g_b3582f8c-2777-4291-bc6a-42953fd2d928/manager/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.858893 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5797d476c-ntgb9_e4686a7d-808f-47e8-b5cd-ec3af299a7f2/kube-rbac-proxy/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.989051 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7lkjq_78813232-79b6-4483-86cb-069995914531/kube-rbac-proxy/0.log" Dec 03 13:42:04 crc kubenswrapper[4666]: I1203 13:42:04.989870 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5797d476c-ntgb9_e4686a7d-808f-47e8-b5cd-ec3af299a7f2/manager/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.094762 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7lkjq_78813232-79b6-4483-86cb-069995914531/manager/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.215406 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-khplb_8048d3e0-a035-4a85-92ad-ca11dc24ccbe/kube-rbac-proxy/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.266158 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-khplb_8048d3e0-a035-4a85-92ad-ca11dc24ccbe/manager/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.362370 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2744f" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="registry-server" containerID="cri-o://0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2" gracePeriod=2 Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.432124 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ckxvk_d90e913f-9878-4644-b0f7-d0e313b8f897/kube-rbac-proxy/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.584601 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ckxvk_d90e913f-9878-4644-b0f7-d0e313b8f897/manager/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.639202 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-q5d7b_5a914e37-4302-4c77-8d4b-6c509dfbfc4e/kube-rbac-proxy/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.678247 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-q5d7b_5a914e37-4302-4c77-8d4b-6c509dfbfc4e/manager/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.812455 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt_865a9d83-50b6-49fb-87f8-c46fa1453ed0/kube-rbac-proxy/0.log" Dec 03 13:42:05 crc kubenswrapper[4666]: I1203 13:42:05.843207 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt_865a9d83-50b6-49fb-87f8-c46fa1453ed0/manager/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.335672 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f9cd9598-chsns_7aa8983e-49b3-4356-aae3-5388d37ae886/operator/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.340179 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.384795 4666 generic.go:334] "Generic (PLEG): container finished" podID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerID="0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2" exitCode=0 Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.384842 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerDied","Data":"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2"} Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.384872 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2744f" event={"ID":"32e05d12-adab-4860-b058-33a6bbc0fb6c","Type":"ContainerDied","Data":"18eb967cbc7d3d9fffaf9a6a4df1cdab5cf5b1038f39f6faa3db4e3132927445"} Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.384895 4666 scope.go:117] "RemoveContainer" containerID="0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.385055 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2744f" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.399028 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rt2zn_1d364f72-b379-4591-b3f4-17997cbcba6e/kube-rbac-proxy/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.401870 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities\") pod \"32e05d12-adab-4860-b058-33a6bbc0fb6c\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.401974 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content\") pod \"32e05d12-adab-4860-b058-33a6bbc0fb6c\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.402289 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn5m8\" (UniqueName: \"kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8\") pod \"32e05d12-adab-4860-b058-33a6bbc0fb6c\" (UID: \"32e05d12-adab-4860-b058-33a6bbc0fb6c\") " Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.406844 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities" (OuterVolumeSpecName: "utilities") pod "32e05d12-adab-4860-b058-33a6bbc0fb6c" (UID: "32e05d12-adab-4860-b058-33a6bbc0fb6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.410007 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8" (OuterVolumeSpecName: "kube-api-access-xn5m8") pod "32e05d12-adab-4860-b058-33a6bbc0fb6c" (UID: "32e05d12-adab-4860-b058-33a6bbc0fb6c"). InnerVolumeSpecName "kube-api-access-xn5m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.411312 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z7hzf_04142222-9a39-4c8f-81b1-df4035625463/registry-server/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.418932 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32e05d12-adab-4860-b058-33a6bbc0fb6c" (UID: "32e05d12-adab-4860-b058-33a6bbc0fb6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.421298 4666 scope.go:117] "RemoveContainer" containerID="14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.490388 4666 scope.go:117] "RemoveContainer" containerID="cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.508849 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn5m8\" (UniqueName: \"kubernetes.io/projected/32e05d12-adab-4860-b058-33a6bbc0fb6c-kube-api-access-xn5m8\") on node \"crc\" DevicePath \"\"" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.508882 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.508894 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e05d12-adab-4860-b058-33a6bbc0fb6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.548567 4666 scope.go:117] "RemoveContainer" containerID="0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2" Dec 03 13:42:06 crc kubenswrapper[4666]: E1203 13:42:06.556910 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2\": container with ID starting with 0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2 not found: ID does not exist" containerID="0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.556954 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2"} err="failed to get container status \"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2\": rpc error: code = NotFound desc = could not find container \"0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2\": container with ID starting with 0c72439202f7909ac02424ea74bc9aee6fc63af1e2e7e1679875902c426b29f2 not found: ID does not exist" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.556979 4666 scope.go:117] "RemoveContainer" containerID="14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148" Dec 03 13:42:06 crc kubenswrapper[4666]: E1203 13:42:06.559036 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148\": container with ID starting with 14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148 not found: ID does not exist" containerID="14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.559070 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148"} err="failed to get container status \"14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148\": rpc error: code = NotFound desc = could not find container 
\"14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148\": container with ID starting with 14fd7ae18f56d81fc15c090bf2f80d25b648a8529a018340997a80da3ce6e148 not found: ID does not exist" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.559113 4666 scope.go:117] "RemoveContainer" containerID="cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a" Dec 03 13:42:06 crc kubenswrapper[4666]: E1203 13:42:06.561819 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a\": container with ID starting with cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a not found: ID does not exist" containerID="cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.561850 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a"} err="failed to get container status \"cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a\": rpc error: code = NotFound desc = could not find container \"cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a\": container with ID starting with cdd47a632d081e078eb85d020e510c41c348e65780d1a53bdf57a84c056e368a not found: ID does not exist" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.675145 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rt2zn_1d364f72-b379-4591-b3f4-17997cbcba6e/manager/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.761425 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bwxhd_deefb3d8-d96a-4e86-839d-8d8a561f4645/kube-rbac-proxy/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.762381 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.776608 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2744f"] Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.787360 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bwxhd_deefb3d8-d96a-4e86-839d-8d8a561f4645/manager/0.log" Dec 03 13:42:06 crc kubenswrapper[4666]: I1203 13:42:06.978941 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-d49kn_e0637cb9-5703-4e26-b526-592b818a5304/operator/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.016954 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-x2zqw_20594b02-a42f-4747-abfc-cbee34847d81/kube-rbac-proxy/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.195160 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-x2zqw_20594b02-a42f-4747-abfc-cbee34847d81/manager/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.286821 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-bcvbt_494c67d4-f61e-468c-a8d8-21a877c690e8/kube-rbac-proxy/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.379598 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-bcvbt_494c67d4-f61e-468c-a8d8-21a877c690e8/manager/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.411686 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54468f9998-5pr6c_ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45/manager/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.442257 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" path="/var/lib/kubelet/pods/32e05d12-adab-4860-b058-33a6bbc0fb6c/volumes" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.530289 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xwp75_39681ef6-2d50-4509-a81e-d6cd102695cd/manager/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.543775 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xwp75_39681ef6-2d50-4509-a81e-d6cd102695cd/kube-rbac-proxy/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.656227 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xmmjr_f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f/kube-rbac-proxy/0.log" Dec 03 13:42:07 crc kubenswrapper[4666]: I1203 13:42:07.662292 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xmmjr_f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f/manager/0.log" Dec 03 13:42:08 crc kubenswrapper[4666]: I1203 13:42:08.424153 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:42:08 crc kubenswrapper[4666]: E1203 13:42:08.424537 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:42:19 crc kubenswrapper[4666]: I1203 13:42:19.423304 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:42:19 crc kubenswrapper[4666]: E1203 13:42:19.424113 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:42:26 crc kubenswrapper[4666]: I1203 13:42:26.335446 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bsdf6_61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae/control-plane-machine-set-operator/0.log" 
Dec 03 13:42:26 crc kubenswrapper[4666]: I1203 13:42:26.493964 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dn4sx_7d685477-11f2-4bfb-98c2-6eb76b6697c3/kube-rbac-proxy/0.log"
Dec 03 13:42:26 crc kubenswrapper[4666]: I1203 13:42:26.525205 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dn4sx_7d685477-11f2-4bfb-98c2-6eb76b6697c3/machine-api-operator/0.log"
Dec 03 13:42:33 crc kubenswrapper[4666]: I1203 13:42:33.423772 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:42:33 crc kubenswrapper[4666]: E1203 13:42:33.424818 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:42:38 crc kubenswrapper[4666]: I1203 13:42:38.055220 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-q94td_100d1193-8c3e-442e-8e9c-9983b5292555/cert-manager-controller/0.log"
Dec 03 13:42:38 crc kubenswrapper[4666]: I1203 13:42:38.237594 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-g7hwp_9dafb972-28e7-4392-9d8f-0d6036c5adab/cert-manager-cainjector/0.log"
Dec 03 13:42:38 crc kubenswrapper[4666]: I1203 13:42:38.251176 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nf766_b7fc3a3d-9867-4055-b071-c43574b66e7a/cert-manager-webhook/0.log"
Dec 03 13:42:47 crc kubenswrapper[4666]: I1203 13:42:47.424302 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:42:47 crc kubenswrapper[4666]: E1203 13:42:47.425293 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:42:49 crc kubenswrapper[4666]: I1203 13:42:49.602852 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-m4v87_8f7dae47-5e1c-4945-9827-33a00c4c0d66/nmstate-console-plugin/0.log"
Dec 03 13:42:49 crc kubenswrapper[4666]: I1203 13:42:49.774377 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7x7ll_02c230c0-43ab-4476-b3dd-64cb686195c0/nmstate-handler/0.log"
Dec 03 13:42:49 crc kubenswrapper[4666]: I1203 13:42:49.784863 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5gp27_65d7f250-10bf-4a17-879f-856d2ea16b91/kube-rbac-proxy/0.log"
Dec 03 13:42:49 crc kubenswrapper[4666]: I1203 13:42:49.825445 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5gp27_65d7f250-10bf-4a17-879f-856d2ea16b91/nmstate-metrics/0.log"
Dec 03 13:42:49 crc kubenswrapper[4666]: I1203 13:42:49.954421 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-wqnxn_c28a05b1-6eb0-43a1-a581-c8a5f3b956b6/nmstate-operator/0.log"
Dec 03 13:42:50 crc kubenswrapper[4666]: I1203 13:42:50.023180 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-vvj59_f43cdd10-999d-470a-89d0-909660ec7e67/nmstate-webhook/0.log"
Dec 03 13:43:00 crc kubenswrapper[4666]: I1203 13:43:00.423715 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:43:00 crc kubenswrapper[4666]: E1203 13:43:00.424618 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.160667 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4zn7s_782afea8-e67f-4724-992b-6d318c9f9e5c/kube-rbac-proxy/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.269419 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4zn7s_782afea8-e67f-4724-992b-6d318c9f9e5c/controller/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.403806 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-xjlzk_eaa2e763-b8bc-4f22-9bbf-43d36d8c2088/frr-k8s-webhook-server/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.449958 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.655606 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.675927 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.686544 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.733708 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.881266 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.884491 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.904022 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log"
Dec 03 13:43:04 crc kubenswrapper[4666]: I1203 13:43:04.963701 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.148349 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.151859 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.152180 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/controller/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.169005 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.324958 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/frr-metrics/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.362297 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/kube-rbac-proxy-frr/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.363244 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/kube-rbac-proxy/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.534585 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/reloader/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.585340 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b8d9b7676-d2hwb_c594fca4-0d6a-47e1-acc4-b9434ce17bb9/manager/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.826196 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-868f45c797-svsbn_9606ffd7-351c-4485-b17a-779a724a1859/webhook-server/0.log"
Dec 03 13:43:05 crc kubenswrapper[4666]: I1203 13:43:05.997897 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27md4_c1d0b522-d828-4f16-9d3b-64b16697898a/kube-rbac-proxy/0.log"
Dec 03 13:43:06 crc kubenswrapper[4666]: I1203 13:43:06.657368 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27md4_c1d0b522-d828-4f16-9d3b-64b16697898a/speaker/0.log"
Dec 03 13:43:06 crc kubenswrapper[4666]: I1203 13:43:06.850016 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/frr/0.log"
Dec 03 13:43:15 crc kubenswrapper[4666]: I1203 13:43:15.424261 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:43:15 crc kubenswrapper[4666]: E1203 13:43:15.425174 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.607313 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.713285 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.743983 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.745820 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.934728 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.946367 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log"
Dec 03 13:43:18 crc kubenswrapper[4666]: I1203 13:43:18.980069 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/extract/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.111045 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.264996 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.267357 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.279183 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.497133 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.500698 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.511187 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/extract/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.661939 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.813165 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.840752 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log"
Dec 03 13:43:19 crc kubenswrapper[4666]: I1203 13:43:19.896839 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.054025 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.075486 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.278642 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.588745 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.617370 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.644708 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/registry-server/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.658331 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.823885 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log"
Dec 03 13:43:20 crc kubenswrapper[4666]: I1203 13:43:20.835154 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.101962 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2vrds_d829bfc6-3cf6-4b30-a501-1586386d7698/marketplace-operator/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.158338 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.398153 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.464019 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.483326 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.566038 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/registry-server/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.635791 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.678156 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.808604 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/registry-server/0.log"
Dec 03 13:43:21 crc kubenswrapper[4666]: I1203 13:43:21.877395 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.034199 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.050370 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.063908 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.212420 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.260438 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log"
Dec 03 13:43:22 crc kubenswrapper[4666]: I1203 13:43:22.841001 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/registry-server/0.log"
Dec 03 13:43:26 crc kubenswrapper[4666]: I1203 13:43:26.424246 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:43:26 crc kubenswrapper[4666]: E1203 13:43:26.424972 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:43:38 crc kubenswrapper[4666]: I1203 13:43:38.423359 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:43:38 crc kubenswrapper[4666]: E1203 13:43:38.423946 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:43:50 crc kubenswrapper[4666]: I1203 13:43:50.423810 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:43:50 crc kubenswrapper[4666]: E1203 13:43:50.424968 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:44:03 crc kubenswrapper[4666]: I1203 13:44:03.423827 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:44:03 crc kubenswrapper[4666]: E1203 13:44:03.425753 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:44:17 crc kubenswrapper[4666]: I1203 13:44:17.424789 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:44:17 crc kubenswrapper[4666]: E1203 13:44:17.425803 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:44:31 crc kubenswrapper[4666]: I1203 13:44:31.428332 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:44:31 crc kubenswrapper[4666]: E1203 13:44:31.428940 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:44:43 crc kubenswrapper[4666]: I1203 13:44:43.423708 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:44:43 crc kubenswrapper[4666]: E1203 13:44:43.424681 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:44:57 crc kubenswrapper[4666]: I1203 13:44:57.425020 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:44:57 crc kubenswrapper[4666]: E1203 13:44:57.425964 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.160922 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"]
Dec 03 13:45:00 crc kubenswrapper[4666]: E1203 13:45:00.161685 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="extract-utilities"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.161702 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="extract-utilities"
Dec 03 13:45:00 crc kubenswrapper[4666]: E1203 13:45:00.161731 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="registry-server"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.161738 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="registry-server"
Dec 03 13:45:00 crc kubenswrapper[4666]: E1203 13:45:00.161771 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="extract-content"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.161778 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="extract-content"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.162111 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e05d12-adab-4860-b058-33a6bbc0fb6c" containerName="registry-server"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.162766 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.165134 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.166452 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.166521 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.166553 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnk9\" (UniqueName: \"kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.166610 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.177755 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"]
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.268236 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.268537 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnk9\" (UniqueName: \"kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.268574 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.271599 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.275541 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.289610 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnk9\" (UniqueName: \"kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9\") pod \"collect-profiles-29412825-pz4np\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.495637 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:00 crc kubenswrapper[4666]: I1203 13:45:00.979288 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"]
Dec 03 13:45:01 crc kubenswrapper[4666]: I1203 13:45:01.091174 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np" event={"ID":"21701ab1-4bef-4eec-92d8-c1705b99396a","Type":"ContainerStarted","Data":"b4580ad9dec8a9fddb0470ff4230ccbcb84bff36bfebfaf8b148b18ea136e9f0"}
Dec 03 13:45:02 crc kubenswrapper[4666]: I1203 13:45:02.100929 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np" event={"ID":"21701ab1-4bef-4eec-92d8-c1705b99396a","Type":"ContainerStarted","Data":"8c7870d053f60a52840ada3af24db3cc2a198d97c977d570722a12d5ad57d21c"}
Dec 03 13:45:02 crc kubenswrapper[4666]: I1203 13:45:02.131230 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np" podStartSLOduration=2.131199157 podStartE2EDuration="2.131199157s" podCreationTimestamp="2025-12-03 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:45:02.119931124 +0000 UTC m=+5490.964892235" watchObservedRunningTime="2025-12-03 13:45:02.131199157 +0000 UTC m=+5490.976160238"
Dec 03 13:45:03 crc kubenswrapper[4666]: I1203 13:45:03.116059 4666 generic.go:334] "Generic (PLEG): container finished" podID="21701ab1-4bef-4eec-92d8-c1705b99396a" containerID="8c7870d053f60a52840ada3af24db3cc2a198d97c977d570722a12d5ad57d21c" exitCode=0
Dec 03 13:45:03 crc kubenswrapper[4666]: I1203 13:45:03.116206 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np" event={"ID":"21701ab1-4bef-4eec-92d8-c1705b99396a","Type":"ContainerDied","Data":"8c7870d053f60a52840ada3af24db3cc2a198d97c977d570722a12d5ad57d21c"}
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.513823 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.665282 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume\") pod \"21701ab1-4bef-4eec-92d8-c1705b99396a\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") "
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.665409 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume\") pod \"21701ab1-4bef-4eec-92d8-c1705b99396a\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") "
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.665609 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knnk9\" (UniqueName: \"kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9\") pod \"21701ab1-4bef-4eec-92d8-c1705b99396a\" (UID: \"21701ab1-4bef-4eec-92d8-c1705b99396a\") "
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.666650 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume" (OuterVolumeSpecName: "config-volume") pod "21701ab1-4bef-4eec-92d8-c1705b99396a" (UID: "21701ab1-4bef-4eec-92d8-c1705b99396a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.670716 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9" (OuterVolumeSpecName: "kube-api-access-knnk9") pod "21701ab1-4bef-4eec-92d8-c1705b99396a" (UID: "21701ab1-4bef-4eec-92d8-c1705b99396a"). InnerVolumeSpecName "kube-api-access-knnk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.677279 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "21701ab1-4bef-4eec-92d8-c1705b99396a" (UID: "21701ab1-4bef-4eec-92d8-c1705b99396a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.767620 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21701ab1-4bef-4eec-92d8-c1705b99396a-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.767651 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/21701ab1-4bef-4eec-92d8-c1705b99396a-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 13:45:04 crc kubenswrapper[4666]: I1203 13:45:04.767661 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knnk9\" (UniqueName: \"kubernetes.io/projected/21701ab1-4bef-4eec-92d8-c1705b99396a-kube-api-access-knnk9\") on node \"crc\" DevicePath \"\""
Dec 03 13:45:05 crc kubenswrapper[4666]: I1203 13:45:05.145903 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np" event={"ID":"21701ab1-4bef-4eec-92d8-c1705b99396a","Type":"ContainerDied","Data":"b4580ad9dec8a9fddb0470ff4230ccbcb84bff36bfebfaf8b148b18ea136e9f0"}
Dec 03 13:45:05 crc kubenswrapper[4666]: I1203 13:45:05.145947 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412825-pz4np"
Dec 03 13:45:05 crc kubenswrapper[4666]: I1203 13:45:05.145954 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4580ad9dec8a9fddb0470ff4230ccbcb84bff36bfebfaf8b148b18ea136e9f0"
Dec 03 13:45:05 crc kubenswrapper[4666]: I1203 13:45:05.584597 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5"]
Dec 03 13:45:05 crc kubenswrapper[4666]: I1203 13:45:05.594915 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412780-bjwt5"]
Dec 03 13:45:07 crc kubenswrapper[4666]: I1203 13:45:07.441820 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870f863c-0bf4-437b-9c21-90e68cea84de" path="/var/lib/kubelet/pods/870f863c-0bf4-437b-9c21-90e68cea84de/volumes"
Dec 03 13:45:09 crc kubenswrapper[4666]: I1203 13:45:09.426013 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:45:09 crc kubenswrapper[4666]: E1203 13:45:09.427267 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:45:16 crc kubenswrapper[4666]: I1203 13:45:16.248076 4666 generic.go:334] "Generic (PLEG): container finished" podID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerID="c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c" exitCode=0
Dec 03 13:45:16 crc kubenswrapper[4666]: I1203 13:45:16.248172 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" event={"ID":"d6fff386-9ad2-4f14-9c07-772e90f24102","Type":"ContainerDied","Data":"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"}
Dec 03 13:45:16 crc kubenswrapper[4666]: I1203 13:45:16.249249 4666 scope.go:117] "RemoveContainer" containerID="c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"
Dec 03 13:45:16 crc kubenswrapper[4666]: I1203 13:45:16.609147 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5d8dq_must-gather-w2l7k_d6fff386-9ad2-4f14-9c07-772e90f24102/gather/0.log"
Dec 03 13:45:20 crc kubenswrapper[4666]: I1203 13:45:20.424203 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:45:20 crc kubenswrapper[4666]: E1203 13:45:20.424897 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:45:25 crc kubenswrapper[4666]: I1203 13:45:25.480953 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5d8dq/must-gather-w2l7k"]
Dec 03 13:45:25 crc kubenswrapper[4666]: I1203 13:45:25.482019 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5d8dq/must-gather-w2l7k" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="copy" containerID="cri-o://639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05" gracePeriod=2
Dec 03 13:45:25 crc kubenswrapper[4666]: I1203 13:45:25.489396 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5d8dq/must-gather-w2l7k"]
Dec 03 13:45:25 crc kubenswrapper[4666]: I1203 13:45:25.913857 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5d8dq_must-gather-w2l7k_d6fff386-9ad2-4f14-9c07-772e90f24102/copy/0.log"
Dec 03 13:45:25 crc kubenswrapper[4666]: I1203 13:45:25.914713 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/must-gather-w2l7k"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.027804 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output\") pod \"d6fff386-9ad2-4f14-9c07-772e90f24102\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") "
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.027927 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsgx2\" (UniqueName: \"kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2\") pod \"d6fff386-9ad2-4f14-9c07-772e90f24102\" (UID: \"d6fff386-9ad2-4f14-9c07-772e90f24102\") "
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.035374 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2" (OuterVolumeSpecName: "kube-api-access-fsgx2") pod "d6fff386-9ad2-4f14-9c07-772e90f24102" (UID: "d6fff386-9ad2-4f14-9c07-772e90f24102"). InnerVolumeSpecName "kube-api-access-fsgx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.131368 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsgx2\" (UniqueName: \"kubernetes.io/projected/d6fff386-9ad2-4f14-9c07-772e90f24102-kube-api-access-fsgx2\") on node \"crc\" DevicePath \"\""
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.184244 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d6fff386-9ad2-4f14-9c07-772e90f24102" (UID: "d6fff386-9ad2-4f14-9c07-772e90f24102"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.233622 4666 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6fff386-9ad2-4f14-9c07-772e90f24102-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.351644 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5d8dq_must-gather-w2l7k_d6fff386-9ad2-4f14-9c07-772e90f24102/copy/0.log"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.352036 4666 generic.go:334] "Generic (PLEG): container finished" podID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerID="639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05" exitCode=143
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.352146 4666 scope.go:117] "RemoveContainer" containerID="639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.352194 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5d8dq/must-gather-w2l7k"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.374151 4666 scope.go:117] "RemoveContainer" containerID="c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.424406 4666 scope.go:117] "RemoveContainer" containerID="639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05"
Dec 03 13:45:26 crc kubenswrapper[4666]: E1203 13:45:26.425209 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05\": container with ID starting with 639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05 not found: ID does not exist" containerID="639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.425250 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05"} err="failed to get container status \"639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05\": rpc error: code = NotFound desc = could not find container \"639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05\": container with ID starting with 639e62f9f8c68bb421d67843d75f30b985fc7d9e6411bcc6451b17615f93ab05 not found: ID does not exist"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.425276 4666 scope.go:117] "RemoveContainer" containerID="c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"
Dec 03 13:45:26 crc kubenswrapper[4666]: E1203 13:45:26.425763 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c\": container with ID starting with c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c not found: ID does not exist" containerID="c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"
Dec 03 13:45:26 crc kubenswrapper[4666]: I1203 13:45:26.425797 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c"} err="failed to get container status \"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c\": rpc error: code = NotFound desc = could not find container \"c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c\": container with ID starting with c94a063c730b6701830c1d1fa4c7134dca35045f05bced088db7fe86d3507b2c not found: ID does not exist"
Dec 03 13:45:27 crc kubenswrapper[4666]: I1203 13:45:27.437559 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" path="/var/lib/kubelet/pods/d6fff386-9ad2-4f14-9c07-772e90f24102/volumes"
Dec 03 13:45:33 crc kubenswrapper[4666]: I1203 13:45:33.424390 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:45:33 crc kubenswrapper[4666]: E1203 13:45:33.427726 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:45:48 crc kubenswrapper[4666]: I1203 13:45:48.424307 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:45:48 crc kubenswrapper[4666]: E1203 13:45:48.425169 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.089111 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:45:49 crc kubenswrapper[4666]: E1203 13:45:49.089508 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21701ab1-4bef-4eec-92d8-c1705b99396a" containerName="collect-profiles"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.089524 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="21701ab1-4bef-4eec-92d8-c1705b99396a" containerName="collect-profiles"
Dec 03 13:45:49 crc kubenswrapper[4666]: E1203 13:45:49.089546 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="gather"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.089552 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="gather"
Dec 03 13:45:49 crc kubenswrapper[4666]: E1203 13:45:49.089569 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="copy"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.089575 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="copy"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.106039 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="21701ab1-4bef-4eec-92d8-c1705b99396a" containerName="collect-profiles"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.106112 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="copy"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.106148 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fff386-9ad2-4f14-9c07-772e90f24102" containerName="gather"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.110884 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.111158 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.226184 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skdlm\" (UniqueName: \"kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.226529 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.226576 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.328152 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skdlm\" (UniqueName: \"kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.328261 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.328310 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.328899 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.328909 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.347590 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skdlm\" (UniqueName: \"kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm\") pod \"certified-operators-rvzkh\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") " pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.440690 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:49 crc kubenswrapper[4666]: I1203 13:45:49.964167 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:45:50 crc kubenswrapper[4666]: I1203 13:45:50.581591 4666 generic.go:334] "Generic (PLEG): container finished" podID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerID="093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5" exitCode=0
Dec 03 13:45:50 crc kubenswrapper[4666]: I1203 13:45:50.581686 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerDied","Data":"093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5"}
Dec 03 13:45:50 crc kubenswrapper[4666]: I1203 13:45:50.581877 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerStarted","Data":"1c91d065a330b11dd0dfec684c58c5e0b18907d484608e9585793b3ea23c5109"}
Dec 03 13:45:52 crc kubenswrapper[4666]: I1203 13:45:52.598547 4666 generic.go:334] "Generic (PLEG): container finished" podID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerID="1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace" exitCode=0
Dec 03 13:45:52 crc kubenswrapper[4666]: I1203 13:45:52.598664 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerDied","Data":"1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace"}
Dec 03 13:45:53 crc kubenswrapper[4666]: I1203 13:45:53.608981 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerStarted","Data":"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"}
Dec 03 13:45:53 crc kubenswrapper[4666]: I1203 13:45:53.628687 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvzkh" podStartSLOduration=2.147115714 podStartE2EDuration="4.628670082s" podCreationTimestamp="2025-12-03 13:45:49 +0000 UTC" firstStartedPulling="2025-12-03 13:45:50.584622503 +0000 UTC m=+5539.429583554" lastFinishedPulling="2025-12-03 13:45:53.066176871 +0000 UTC m=+5541.911137922" observedRunningTime="2025-12-03 13:45:53.624060378 +0000 UTC m=+5542.469021429" watchObservedRunningTime="2025-12-03 13:45:53.628670082 +0000 UTC m=+5542.473631133"
Dec 03 13:45:54 crc kubenswrapper[4666]: I1203 13:45:54.085645 4666 scope.go:117] "RemoveContainer" containerID="d23087e2b27dfb7d67f23c16864bdd8a4ab38019f28f75d3cf8402f85a495845"
Dec 03 13:45:54 crc kubenswrapper[4666]: I1203 13:45:54.104115 4666 scope.go:117] "RemoveContainer" containerID="39773b043e319b66536d052b352275f429a53be12ec99508c08ed24e1a8ecc1b"
Dec 03 13:45:59 crc kubenswrapper[4666]: I1203 13:45:59.441438 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:59 crc kubenswrapper[4666]: I1203 13:45:59.441973 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:59 crc kubenswrapper[4666]: I1203 13:45:59.490592 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:59 crc kubenswrapper[4666]: I1203 13:45:59.704288 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:45:59 crc kubenswrapper[4666]: I1203 13:45:59.755071 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:46:01 crc kubenswrapper[4666]: I1203 13:46:01.674023 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvzkh" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="registry-server" containerID="cri-o://1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164" gracePeriod=2
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.196327 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.303507 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content\") pod \"20cadb85-00d0-488b-835c-f3f7b1378cd1\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") "
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.303602 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities\") pod \"20cadb85-00d0-488b-835c-f3f7b1378cd1\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") "
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.303692 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skdlm\" (UniqueName: \"kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm\") pod \"20cadb85-00d0-488b-835c-f3f7b1378cd1\" (UID: \"20cadb85-00d0-488b-835c-f3f7b1378cd1\") "
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.304879 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities" (OuterVolumeSpecName: "utilities") pod "20cadb85-00d0-488b-835c-f3f7b1378cd1" (UID: "20cadb85-00d0-488b-835c-f3f7b1378cd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.310606 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm" (OuterVolumeSpecName: "kube-api-access-skdlm") pod "20cadb85-00d0-488b-835c-f3f7b1378cd1" (UID: "20cadb85-00d0-488b-835c-f3f7b1378cd1"). InnerVolumeSpecName "kube-api-access-skdlm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.364186 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20cadb85-00d0-488b-835c-f3f7b1378cd1" (UID: "20cadb85-00d0-488b-835c-f3f7b1378cd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.406019 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.406048 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cadb85-00d0-488b-835c-f3f7b1378cd1-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.406059 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skdlm\" (UniqueName: \"kubernetes.io/projected/20cadb85-00d0-488b-835c-f3f7b1378cd1-kube-api-access-skdlm\") on node \"crc\" DevicePath \"\""
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.685324 4666 generic.go:334] "Generic (PLEG): container finished" podID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerID="1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164" exitCode=0
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.685370 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerDied","Data":"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"}
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.685462 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvzkh" event={"ID":"20cadb85-00d0-488b-835c-f3f7b1378cd1","Type":"ContainerDied","Data":"1c91d065a330b11dd0dfec684c58c5e0b18907d484608e9585793b3ea23c5109"}
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.685474 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvzkh"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.685493 4666 scope.go:117] "RemoveContainer" containerID="1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.720129 4666 scope.go:117] "RemoveContainer" containerID="1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.741126 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.749346 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvzkh"]
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.768684 4666 scope.go:117] "RemoveContainer" containerID="093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.795830 4666 scope.go:117] "RemoveContainer" containerID="1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"
Dec 03 13:46:02 crc kubenswrapper[4666]: E1203 13:46:02.796468 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164\": container with ID starting with 1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164 not found: ID does not exist" containerID="1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.796517 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164"} err="failed to get container status \"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164\": rpc error: code = NotFound desc = could not find container \"1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164\": container with ID starting with 1ce052e42d4e4b921e0dc353767852ffdc4fcdeb36daa2e81444b736fa6f0164 not found: ID does not exist"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.796546 4666 scope.go:117] "RemoveContainer" containerID="1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace"
Dec 03 13:46:02 crc kubenswrapper[4666]: E1203 13:46:02.797184 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace\": container with ID starting with 1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace not found: ID does not exist" containerID="1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.797234 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace"} err="failed to get container status \"1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace\": rpc error: code = NotFound desc = could not find container \"1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace\": container with ID starting with 1070636b69ee2064bae9d9765bfb9a789e9edc30d0bdd5631be4c2baab0e8ace not found: ID does not exist"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.797268 4666 scope.go:117] "RemoveContainer" containerID="093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5"
Dec 03 13:46:02 crc kubenswrapper[4666]: E1203 13:46:02.797664 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5\": container with ID starting with 093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5 not found: ID does not exist" containerID="093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5"
Dec 03 13:46:02 crc kubenswrapper[4666]: I1203 13:46:02.797824 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5"} err="failed to get container status \"093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5\": rpc error: code = NotFound desc = could not find container \"093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5\": container with ID starting with 093c49fba16942bdf86764f9dd5964ddfc492f23c704ad131bbb342f628d57d5 not found: ID does not exist"
Dec 03 13:46:03 crc kubenswrapper[4666]: I1203 13:46:03.423781 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:46:03 crc kubenswrapper[4666]: E1203 13:46:03.424115 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:46:03 crc kubenswrapper[4666]: I1203 13:46:03.434362 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" path="/var/lib/kubelet/pods/20cadb85-00d0-488b-835c-f3f7b1378cd1/volumes"
Dec 03 13:46:16 crc kubenswrapper[4666]: I1203 13:46:16.424427 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:46:16 crc kubenswrapper[4666]: E1203 13:46:16.425518 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:46:27 crc kubenswrapper[4666]: I1203 13:46:27.424984 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c"
Dec 03 13:46:27 crc kubenswrapper[4666]: E1203 13:46:27.426067 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2"
Dec 03 13:46:40 crc kubenswrapper[4666]: I1203 13:46:40.423689 4666 scope.go:117] "RemoveContainer"
containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:46:41 crc kubenswrapper[4666]: I1203 13:46:41.102186 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2"} Dec 03 13:46:54 crc kubenswrapper[4666]: I1203 13:46:54.208485 4666 scope.go:117] "RemoveContainer" containerID="7b3ddb084a6f78a43d6ab9268f4548253cd2a7b7a0348721a15e30d762270760" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.178703 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:20 crc kubenswrapper[4666]: E1203 13:47:20.184115 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="extract-content" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.184250 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="extract-content" Dec 03 13:47:20 crc kubenswrapper[4666]: E1203 13:47:20.184339 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="registry-server" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.184414 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="registry-server" Dec 03 13:47:20 crc kubenswrapper[4666]: E1203 13:47:20.184501 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="extract-utilities" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.184572 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="extract-utilities" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.184891 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cadb85-00d0-488b-835c-f3f7b1378cd1" containerName="registry-server" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.187313 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.193321 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.298712 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62n4\" (UniqueName: \"kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.298879 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.298946 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.400852 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.400939 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62n4\" (UniqueName: \"kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.401032 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.401559 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.401863 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.457687 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g62n4\" (UniqueName: \"kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4\") pod \"redhat-operators-7gq9q\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:20 crc kubenswrapper[4666]: I1203 13:47:20.541728 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:21 crc kubenswrapper[4666]: I1203 13:47:21.078940 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:21 crc kubenswrapper[4666]: I1203 13:47:21.453603 4666 generic.go:334] "Generic (PLEG): container finished" podID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerID="2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607" exitCode=0 Dec 03 13:47:21 crc kubenswrapper[4666]: I1203 13:47:21.453946 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerDied","Data":"2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607"} Dec 03 13:47:21 crc kubenswrapper[4666]: I1203 13:47:21.453980 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerStarted","Data":"4b481f1fabf2b8ae67e327bd334580285a7a160d02c749e2ec579e0592b0b2fc"} Dec 03 13:47:21 crc kubenswrapper[4666]: I1203 13:47:21.457466 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:47:23 crc kubenswrapper[4666]: I1203 13:47:23.483300 4666 generic.go:334] "Generic (PLEG): container finished" podID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerID="83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a" exitCode=0 Dec 03 13:47:23 crc kubenswrapper[4666]: I1203 13:47:23.484058 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerDied","Data":"83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a"} Dec 03 13:47:24 crc kubenswrapper[4666]: I1203 13:47:24.497551 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerStarted","Data":"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3"} Dec 03 13:47:24 crc kubenswrapper[4666]: I1203 13:47:24.522918 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7gq9q" podStartSLOduration=1.952173377 podStartE2EDuration="4.522897286s" podCreationTimestamp="2025-12-03 13:47:20 +0000 UTC" firstStartedPulling="2025-12-03 13:47:21.457217904 +0000 UTC m=+5630.302178955" lastFinishedPulling="2025-12-03 13:47:24.027941803 +0000 UTC m=+5632.872902864" observedRunningTime="2025-12-03 13:47:24.52080918 +0000 UTC m=+5633.365770231" watchObservedRunningTime="2025-12-03 13:47:24.522897286 +0000 UTC m=+5633.367858337" Dec 03 13:47:30 crc kubenswrapper[4666]: I1203 13:47:30.542213 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:30 crc kubenswrapper[4666]: I1203 13:47:30.542798 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:30 crc kubenswrapper[4666]: I1203 13:47:30.600866 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:30 crc kubenswrapper[4666]: I1203 13:47:30.657589 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:30 crc kubenswrapper[4666]: I1203 13:47:30.844396 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:32 crc kubenswrapper[4666]: I1203 13:47:32.607515 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7gq9q" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="registry-server" containerID="cri-o://5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3" gracePeriod=2 Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.099561 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.193918 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62n4\" (UniqueName: \"kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4\") pod \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.194180 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content\") pod \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.194247 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities\") pod \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\" (UID: \"e8e5185a-2d92-4429-927c-d7c9854f7bd1\") " Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.195313 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities" (OuterVolumeSpecName: "utilities") pod "e8e5185a-2d92-4429-927c-d7c9854f7bd1" (UID: "e8e5185a-2d92-4429-927c-d7c9854f7bd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.200083 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4" (OuterVolumeSpecName: "kube-api-access-g62n4") pod "e8e5185a-2d92-4429-927c-d7c9854f7bd1" (UID: "e8e5185a-2d92-4429-927c-d7c9854f7bd1"). InnerVolumeSpecName "kube-api-access-g62n4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.295859 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.295895 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62n4\" (UniqueName: \"kubernetes.io/projected/e8e5185a-2d92-4429-927c-d7c9854f7bd1-kube-api-access-g62n4\") on node \"crc\" DevicePath \"\"" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.620135 4666 generic.go:334] "Generic (PLEG): container finished" podID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerID="5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3" exitCode=0 Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.620227 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7gq9q" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.620263 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerDied","Data":"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3"} Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.621273 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7gq9q" event={"ID":"e8e5185a-2d92-4429-927c-d7c9854f7bd1","Type":"ContainerDied","Data":"4b481f1fabf2b8ae67e327bd334580285a7a160d02c749e2ec579e0592b0b2fc"} Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.621319 4666 scope.go:117] "RemoveContainer" containerID="5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.650222 4666 scope.go:117] "RemoveContainer" containerID="83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.680279 4666 scope.go:117] "RemoveContainer" containerID="2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.756444 4666 scope.go:117] "RemoveContainer" containerID="5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3" Dec 03 13:47:33 crc kubenswrapper[4666]: E1203 13:47:33.757125 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3\": container with ID starting with 5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3 not found: ID does not exist" containerID="5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.757169 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3"} err="failed to get container status \"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3\": rpc error: code = NotFound desc = could not find container \"5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3\": container with ID starting with 5cddf59c0a1d23766e88ea9e29ea268041f88172cda92809666cfe4a2657b8d3 not found: ID does not exist" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.757188 4666 scope.go:117] 
"RemoveContainer" containerID="83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a" Dec 03 13:47:33 crc kubenswrapper[4666]: E1203 13:47:33.757605 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a\": container with ID starting with 83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a not found: ID does not exist" containerID="83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.757685 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a"} err="failed to get container status \"83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a\": rpc error: code = NotFound desc = could not find container \"83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a\": container with ID starting with 83b8443bcde5bed9120e9f51e61c3d174eb05c5300017178312171197bc8dc5a not found: ID does not exist" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.757731 4666 scope.go:117] "RemoveContainer" containerID="2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607" Dec 03 13:47:33 crc kubenswrapper[4666]: E1203 13:47:33.758172 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607\": container with ID starting with 2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607 not found: ID does not exist" containerID="2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607" Dec 03 13:47:33 crc kubenswrapper[4666]: I1203 13:47:33.758201 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607"} err="failed to get container status \"2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607\": rpc error: code = NotFound desc = could not find container \"2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607\": container with ID starting with 2ac27cefbfbf61151802d6814c5c50b03899fe40f8ca8e790769f01eafecd607 not found: ID does not exist" Dec 03 13:47:34 crc kubenswrapper[4666]: I1203 13:47:34.595161 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8e5185a-2d92-4429-927c-d7c9854f7bd1" (UID: "e8e5185a-2d92-4429-927c-d7c9854f7bd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:47:34 crc kubenswrapper[4666]: I1203 13:47:34.621538 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e5185a-2d92-4429-927c-d7c9854f7bd1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:47:34 crc kubenswrapper[4666]: I1203 13:47:34.859226 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:34 crc kubenswrapper[4666]: I1203 13:47:34.869820 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7gq9q"] Dec 03 13:47:35 crc kubenswrapper[4666]: I1203 13:47:35.446413 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" path="/var/lib/kubelet/pods/e8e5185a-2d92-4429-927c-d7c9854f7bd1/volumes" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.619533 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-crktr/must-gather-psf59"] Dec 03 13:48:25 crc kubenswrapper[4666]: E1203 13:48:25.620413 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="extract-content" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.620426 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="extract-content" Dec 03 13:48:25 crc kubenswrapper[4666]: E1203 13:48:25.620445 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="extract-utilities" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.620453 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="extract-utilities" Dec 03 13:48:25 crc kubenswrapper[4666]: E1203 13:48:25.620462 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="registry-server" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.620468 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="registry-server" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.620656 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e5185a-2d92-4429-927c-d7c9854f7bd1" containerName="registry-server" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.621585 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.623825 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-crktr"/"openshift-service-ca.crt" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.624064 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-crktr"/"kube-root-ca.crt" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.624348 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-crktr"/"default-dockercfg-7g2l2" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.647873 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-crktr/must-gather-psf59"] Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.783954 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2vq\" (UniqueName: \"kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.784001 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.886250 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2vq\" (UniqueName: \"kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.886294 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.886840 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.906752 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2vq\" (UniqueName: \"kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq\") pod \"must-gather-psf59\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:25 crc kubenswrapper[4666]: I1203 13:48:25.942151 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:48:26 crc kubenswrapper[4666]: I1203 13:48:26.419591 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-crktr/must-gather-psf59"] Dec 03 13:48:27 crc kubenswrapper[4666]: I1203 13:48:27.150368 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/must-gather-psf59" event={"ID":"7a392c6a-570e-4785-b3a2-e3bb2aa176f5","Type":"ContainerStarted","Data":"f0343179ed702fd94feb3cd00c9584fd2155535d64c8f4b82412df20725015ff"} Dec 03 13:48:27 crc kubenswrapper[4666]: I1203 13:48:27.150684 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/must-gather-psf59" event={"ID":"7a392c6a-570e-4785-b3a2-e3bb2aa176f5","Type":"ContainerStarted","Data":"fbf065bb3359a0290f48ca467acaa05eb19fb2ebda08ddbfbb48dbcdadb12650"} Dec 03 13:48:27 crc kubenswrapper[4666]: I1203 13:48:27.150696 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/must-gather-psf59" event={"ID":"7a392c6a-570e-4785-b3a2-e3bb2aa176f5","Type":"ContainerStarted","Data":"f83f89541e5ae35bb2c1624cf820ba585c053e54f6b873e85f1495dadae78297"} Dec 03 13:48:27 crc kubenswrapper[4666]: I1203 13:48:27.168608 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-crktr/must-gather-psf59" podStartSLOduration=2.1685920850000002 podStartE2EDuration="2.168592085s" podCreationTimestamp="2025-12-03 13:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:48:27.163280112 +0000 UTC m=+5696.008241153" watchObservedRunningTime="2025-12-03 13:48:27.168592085 +0000 UTC m=+5696.013553136" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.513130 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-crktr/crc-debug-4bdm7"] Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.515053 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.692274 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2hp\" (UniqueName: \"kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.692357 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.794546 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2hp\" (UniqueName: \"kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.794667 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.794810 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.823688 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2hp\" (UniqueName: \"kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp\") pod \"crc-debug-4bdm7\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: I1203 13:48:30.834613 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:48:30 crc kubenswrapper[4666]: W1203 13:48:30.878467 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8866cb3_c988_405f_abd5_0308064bf98d.slice/crio-23410c2e3e852cd027efe6bf6d9aba8de920fdebf69a4a341b5f3089379b51f8 WatchSource:0}: Error finding container 23410c2e3e852cd027efe6bf6d9aba8de920fdebf69a4a341b5f3089379b51f8: Status 404 returned error can't find the container with id 23410c2e3e852cd027efe6bf6d9aba8de920fdebf69a4a341b5f3089379b51f8 Dec 03 13:48:31 crc kubenswrapper[4666]: I1203 13:48:31.190523 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-4bdm7" event={"ID":"b8866cb3-c988-405f-abd5-0308064bf98d","Type":"ContainerStarted","Data":"e26b865d275e439408b805d25ff97f506eb39acc7090a9e263fd6b0c197a21f8"} Dec 03 13:48:31 crc kubenswrapper[4666]: I1203 13:48:31.190949 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-4bdm7" event={"ID":"b8866cb3-c988-405f-abd5-0308064bf98d","Type":"ContainerStarted","Data":"23410c2e3e852cd027efe6bf6d9aba8de920fdebf69a4a341b5f3089379b51f8"} Dec 03 13:48:31 crc kubenswrapper[4666]: I1203 13:48:31.231651 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-crktr/crc-debug-4bdm7" podStartSLOduration=1.231636987 podStartE2EDuration="1.231636987s" podCreationTimestamp="2025-12-03 13:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:48:31.217751034 +0000 UTC m=+5700.062712095" watchObservedRunningTime="2025-12-03 13:48:31.231636987 +0000 UTC m=+5700.076598028" Dec 03 13:49:09 crc kubenswrapper[4666]: I1203 13:49:09.866330 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:49:09 crc kubenswrapper[4666]: I1203 13:49:09.866912 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:49:10 crc kubenswrapper[4666]: I1203 13:49:10.547372 4666 generic.go:334] "Generic (PLEG): container finished" podID="b8866cb3-c988-405f-abd5-0308064bf98d" containerID="e26b865d275e439408b805d25ff97f506eb39acc7090a9e263fd6b0c197a21f8" exitCode=0 Dec 03 13:49:10 crc kubenswrapper[4666]: I1203 13:49:10.547475 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-4bdm7" event={"ID":"b8866cb3-c988-405f-abd5-0308064bf98d","Type":"ContainerDied","Data":"e26b865d275e439408b805d25ff97f506eb39acc7090a9e263fd6b0c197a21f8"} Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.664147 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.698209 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-crktr/crc-debug-4bdm7"] Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.706875 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-crktr/crc-debug-4bdm7"] Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.820198 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx2hp\" (UniqueName: \"kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp\") pod \"b8866cb3-c988-405f-abd5-0308064bf98d\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.820256 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host\") pod \"b8866cb3-c988-405f-abd5-0308064bf98d\" (UID: \"b8866cb3-c988-405f-abd5-0308064bf98d\") " Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.820394 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host" (OuterVolumeSpecName: "host") pod "b8866cb3-c988-405f-abd5-0308064bf98d" (UID: "b8866cb3-c988-405f-abd5-0308064bf98d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.820842 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8866cb3-c988-405f-abd5-0308064bf98d-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.829436 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp" (OuterVolumeSpecName: "kube-api-access-mx2hp") pod "b8866cb3-c988-405f-abd5-0308064bf98d" (UID: "b8866cb3-c988-405f-abd5-0308064bf98d"). InnerVolumeSpecName "kube-api-access-mx2hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:49:11 crc kubenswrapper[4666]: I1203 13:49:11.922657 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx2hp\" (UniqueName: \"kubernetes.io/projected/b8866cb3-c988-405f-abd5-0308064bf98d-kube-api-access-mx2hp\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.566615 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23410c2e3e852cd027efe6bf6d9aba8de920fdebf69a4a341b5f3089379b51f8" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.566698 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-4bdm7" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.861611 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-crktr/crc-debug-mxsbg"] Dec 03 13:49:12 crc kubenswrapper[4666]: E1203 13:49:12.862984 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8866cb3-c988-405f-abd5-0308064bf98d" containerName="container-00" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.863009 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8866cb3-c988-405f-abd5-0308064bf98d" containerName="container-00" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.863246 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8866cb3-c988-405f-abd5-0308064bf98d" containerName="container-00" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.864045 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.941695 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9r8\" (UniqueName: \"kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:12 crc kubenswrapper[4666]: I1203 13:49:12.941794 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.043718 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.043873 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.043877 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9r8\" (UniqueName: \"kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.061855 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9r8\" (UniqueName: \"kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8\") pod \"crc-debug-mxsbg\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.184750 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.437605 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8866cb3-c988-405f-abd5-0308064bf98d" path="/var/lib/kubelet/pods/b8866cb3-c988-405f-abd5-0308064bf98d/volumes" Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.576592 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-mxsbg" event={"ID":"32bf721c-547f-43c2-b8f3-99232ba8ed12","Type":"ContainerStarted","Data":"a563c3304661aae27601b23cfd23a903fba22ca3740082cf84bde2f9f5a51b66"} Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.576635 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-mxsbg" event={"ID":"32bf721c-547f-43c2-b8f3-99232ba8ed12","Type":"ContainerStarted","Data":"aba7c08a7a8a0da9a9e90a0f7c7f4a7c495635956974c478288ef896b2f59595"} Dec 03 13:49:13 crc kubenswrapper[4666]: I1203 13:49:13.599567 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-crktr/crc-debug-mxsbg" podStartSLOduration=1.599542842 podStartE2EDuration="1.599542842s" podCreationTimestamp="2025-12-03 13:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 13:49:13.589655167 +0000 UTC m=+5742.434616218" watchObservedRunningTime="2025-12-03 13:49:13.599542842 +0000 UTC m=+5742.444503903" Dec 03 13:49:14 crc kubenswrapper[4666]: I1203 13:49:14.591339 4666 generic.go:334] "Generic (PLEG): container finished" podID="32bf721c-547f-43c2-b8f3-99232ba8ed12" containerID="a563c3304661aae27601b23cfd23a903fba22ca3740082cf84bde2f9f5a51b66" exitCode=0 Dec 03 13:49:14 crc kubenswrapper[4666]: I1203 13:49:14.591421 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-mxsbg" event={"ID":"32bf721c-547f-43c2-b8f3-99232ba8ed12","Type":"ContainerDied","Data":"a563c3304661aae27601b23cfd23a903fba22ca3740082cf84bde2f9f5a51b66"} Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.712028 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.745549 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-crktr/crc-debug-mxsbg"] Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.754958 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-crktr/crc-debug-mxsbg"] Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.893735 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host\") pod \"32bf721c-547f-43c2-b8f3-99232ba8ed12\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.893804 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n9r8\" (UniqueName: \"kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8\") pod \"32bf721c-547f-43c2-b8f3-99232ba8ed12\" (UID: \"32bf721c-547f-43c2-b8f3-99232ba8ed12\") " Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.893817 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host" (OuterVolumeSpecName: "host") pod "32bf721c-547f-43c2-b8f3-99232ba8ed12" (UID: "32bf721c-547f-43c2-b8f3-99232ba8ed12"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.894260 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bf721c-547f-43c2-b8f3-99232ba8ed12-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.899617 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8" (OuterVolumeSpecName: "kube-api-access-7n9r8") pod "32bf721c-547f-43c2-b8f3-99232ba8ed12" (UID: "32bf721c-547f-43c2-b8f3-99232ba8ed12"). InnerVolumeSpecName "kube-api-access-7n9r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:49:15 crc kubenswrapper[4666]: I1203 13:49:15.996523 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n9r8\" (UniqueName: \"kubernetes.io/projected/32bf721c-547f-43c2-b8f3-99232ba8ed12-kube-api-access-7n9r8\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.612624 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba7c08a7a8a0da9a9e90a0f7c7f4a7c495635956974c478288ef896b2f59595" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.612770 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-mxsbg" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.924816 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-crktr/crc-debug-9k8r6"] Dec 03 13:49:16 crc kubenswrapper[4666]: E1203 13:49:16.925301 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bf721c-547f-43c2-b8f3-99232ba8ed12" containerName="container-00" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.925318 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bf721c-547f-43c2-b8f3-99232ba8ed12" containerName="container-00" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.925510 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bf721c-547f-43c2-b8f3-99232ba8ed12" containerName="container-00" Dec 03 13:49:16 crc kubenswrapper[4666]: I1203 13:49:16.926690 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.118528 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.118696 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t2kr\" (UniqueName: \"kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.220457 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.220523 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.220543 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t2kr\" (UniqueName: \"kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.241730 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t2kr\" (UniqueName: \"kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr\") pod \"crc-debug-9k8r6\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.245524 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:17 crc kubenswrapper[4666]: W1203 13:49:17.273898 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf7b87e3_18af_42e1_ae0f_898aef4b8223.slice/crio-72d402d1edbb5f0573c4511aca063873d2dd81c16005fa14e2fb858512328e1c WatchSource:0}: Error finding container 72d402d1edbb5f0573c4511aca063873d2dd81c16005fa14e2fb858512328e1c: Status 404 returned error can't find the container with id 72d402d1edbb5f0573c4511aca063873d2dd81c16005fa14e2fb858512328e1c Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.444950 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bf721c-547f-43c2-b8f3-99232ba8ed12" path="/var/lib/kubelet/pods/32bf721c-547f-43c2-b8f3-99232ba8ed12/volumes" Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.621139 4666 generic.go:334] "Generic (PLEG): container finished" podID="af7b87e3-18af-42e1-ae0f-898aef4b8223" containerID="ee1f95795f674e89d44d15aeb5a289fc7a272d69578f22af38af4ec1bbb5e0a6" exitCode=0 Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.621414 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-9k8r6" event={"ID":"af7b87e3-18af-42e1-ae0f-898aef4b8223","Type":"ContainerDied","Data":"ee1f95795f674e89d44d15aeb5a289fc7a272d69578f22af38af4ec1bbb5e0a6"} Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.621440 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/crc-debug-9k8r6" event={"ID":"af7b87e3-18af-42e1-ae0f-898aef4b8223","Type":"ContainerStarted","Data":"72d402d1edbb5f0573c4511aca063873d2dd81c16005fa14e2fb858512328e1c"} Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.664563 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-crktr/crc-debug-9k8r6"] Dec 03 13:49:17 crc kubenswrapper[4666]: I1203 13:49:17.672395 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-crktr/crc-debug-9k8r6"] Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.730544 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.853433 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t2kr\" (UniqueName: \"kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr\") pod \"af7b87e3-18af-42e1-ae0f-898aef4b8223\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.853551 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host\") pod \"af7b87e3-18af-42e1-ae0f-898aef4b8223\" (UID: \"af7b87e3-18af-42e1-ae0f-898aef4b8223\") " Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.853637 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host" (OuterVolumeSpecName: "host") pod "af7b87e3-18af-42e1-ae0f-898aef4b8223" (UID: "af7b87e3-18af-42e1-ae0f-898aef4b8223"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.853969 4666 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af7b87e3-18af-42e1-ae0f-898aef4b8223-host\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.860769 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr" (OuterVolumeSpecName: "kube-api-access-8t2kr") pod "af7b87e3-18af-42e1-ae0f-898aef4b8223" (UID: "af7b87e3-18af-42e1-ae0f-898aef4b8223"). InnerVolumeSpecName "kube-api-access-8t2kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:49:18 crc kubenswrapper[4666]: I1203 13:49:18.955969 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t2kr\" (UniqueName: \"kubernetes.io/projected/af7b87e3-18af-42e1-ae0f-898aef4b8223-kube-api-access-8t2kr\") on node \"crc\" DevicePath \"\"" Dec 03 13:49:19 crc kubenswrapper[4666]: I1203 13:49:19.435077 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7b87e3-18af-42e1-ae0f-898aef4b8223" path="/var/lib/kubelet/pods/af7b87e3-18af-42e1-ae0f-898aef4b8223/volumes" Dec 03 13:49:19 crc kubenswrapper[4666]: I1203 13:49:19.640132 4666 scope.go:117] "RemoveContainer" containerID="ee1f95795f674e89d44d15aeb5a289fc7a272d69578f22af38af4ec1bbb5e0a6" Dec 03 13:49:19 crc kubenswrapper[4666]: I1203 13:49:19.640832 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crktr/crc-debug-9k8r6" Dec 03 13:49:39 crc kubenswrapper[4666]: I1203 13:49:39.866063 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:49:39 crc kubenswrapper[4666]: I1203 13:49:39.866511 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:50:09 crc kubenswrapper[4666]: I1203 13:50:09.866468 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:50:09 crc kubenswrapper[4666]: I1203 13:50:09.868899 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:50:09 crc kubenswrapper[4666]: I1203 13:50:09.869163 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:50:09 crc kubenswrapper[4666]: I1203 13:50:09.870319 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:50:09 crc kubenswrapper[4666]: I1203 13:50:09.870559 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2" gracePeriod=600 Dec 03 13:50:10 crc kubenswrapper[4666]: I1203 13:50:10.091016 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2" exitCode=0 Dec 03 13:50:10 crc kubenswrapper[4666]: I1203 13:50:10.091065 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2"} Dec 03 13:50:10 crc kubenswrapper[4666]: I1203 13:50:10.091127 4666 scope.go:117] "RemoveContainer" containerID="70df592f8d397cfda8668f282fa07f12a851c2dc92dfb339da8a251795adc12c" Dec 03 13:50:11 crc kubenswrapper[4666]: I1203 13:50:11.101962 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820"} Dec 03 13:50:12 crc kubenswrapper[4666]: I1203 13:50:12.705163 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6d7b74fd-t9f48_e1a5016b-ec1e-485e-bedf-ced8377f2aae/barbican-api/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.021533 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d6d7b74fd-t9f48_e1a5016b-ec1e-485e-bedf-ced8377f2aae/barbican-api-log/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.048436 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64566d6d86-kczk4_b0043f11-1613-418f-9974-e88038dd7e5e/barbican-keystone-listener/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.261549 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f7f466d4c-4ps5s_7dde289d-753b-4a00-8863-b671281a0bef/barbican-worker/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.309017 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64566d6d86-kczk4_b0043f11-1613-418f-9974-e88038dd7e5e/barbican-keystone-listener-log/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.346296 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f7f466d4c-4ps5s_7dde289d-753b-4a00-8863-b671281a0bef/barbican-worker-log/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.507214 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zmthj_1bcf0d09-1d7c-4a79-a477-f10b1584bc42/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.553354 4666 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/ceilometer-central-agent/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.723789 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/proxy-httpd/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.736134 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/ceilometer-notification-agent/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.748287 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cad8148a-80c4-407d-a0e0-6fd679f60f89/sg-core/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.932502 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w89rf_01ad9f92-ff9d-4a4b-ba01-4d64dc20a1d6/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:13 crc kubenswrapper[4666]: I1203 13:50:13.971411 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-29td7_0d695826-87fe-4625-9925-988306a9e16b/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.165992 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_58b62594-91e8-4cc7-8076-094fba5bcc66/cinder-api-log/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.180404 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_58b62594-91e8-4cc7-8076-094fba5bcc66/cinder-api/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.414699 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_37430c8d-6678-44c5-a349-8cb94fbb9108/probe/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.549620 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b7e78364-3d2e-435a-a0fb-d85cb2586006/cinder-scheduler/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.589198 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_37430c8d-6678-44c5-a349-8cb94fbb9108/cinder-backup/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.677558 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b7e78364-3d2e-435a-a0fb-d85cb2586006/probe/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.818009 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_978def64-fbe5-4ce2-a2ab-f12bd95ef64a/probe/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.850505 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_978def64-fbe5-4ce2-a2ab-f12bd95ef64a/cinder-volume/0.log" Dec 03 13:50:14 crc kubenswrapper[4666]: I1203 13:50:14.904262 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4l5bs_55fa6ece-d9b6-4ccb-a08f-82e7e0d6d9e4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.056754 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2hmvx_49d2d5f0-5c89-4847-856e-cf9ed17510ec/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.178999 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/init/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.279177 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/init/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.327278 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wjfgn_a761ff17-0cbb-43ca-83bc-5fb2b684203f/dnsmasq-dns/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.410855 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0db6845-8502-4fa7-acdf-20c2395ca177/glance-httpd/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.492249 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0db6845-8502-4fa7-acdf-20c2395ca177/glance-log/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.566463 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8c0a38f-bf71-4907-ba78-bef2e7227dc6/glance-httpd/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.602728 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8c0a38f-bf71-4907-ba78-bef2e7227dc6/glance-log/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.864107 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bd58698c4-v4zw4_a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd/horizon/0.log" Dec 03 13:50:15 crc kubenswrapper[4666]: I1203 13:50:15.952868 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8g5v8_7a99d58b-e139-49f4-8689-faeb388b82ff/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:16 crc kubenswrapper[4666]: I1203 13:50:16.030557 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bd58698c4-v4zw4_a1ad5bab-5f58-4490-b2ee-e0f566b8d6dd/horizon-log/0.log" Dec 03 13:50:16 crc kubenswrapper[4666]: I1203 13:50:16.065592 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r2rn2_19109296-aca4-46fa-95d5-70dcd8604ab7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:16 crc kubenswrapper[4666]: I1203 13:50:16.395698 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412781-4pd56_867a9f13-2579-4e34-9c29-97847041400d/keystone-cron/0.log" Dec 03 13:50:16 crc kubenswrapper[4666]: I1203 13:50:16.505135 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d2050e43-459e-42d5-ae48-1e8e03dd089f/kube-state-metrics/0.log" Dec 03 13:50:16 crc kubenswrapper[4666]: I1203 13:50:16.698254 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kxg6v_a165368e-be15-48d7-afad-92850b6844ea/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.162711 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_1aee1c1a-28b6-4db6-b927-a484fa641914/probe/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.247689 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1aee1c1a-28b6-4db6-b927-a484fa641914/manila-scheduler/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.250057 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5fdb97596b-722zc_fad25ce8-9656-42a7-bc6a-369e68732b1e/keystone-api/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.270455 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8015a12c-752c-489f-a52f-da3bf0ab2977/manila-api/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.483451 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_11798731-763e-4a1e-97dc-54a4ff717ddf/probe/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.726527 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8015a12c-752c-489f-a52f-da3bf0ab2977/manila-api-log/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.752542 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_11798731-763e-4a1e-97dc-54a4ff717ddf/manila-share/0.log" Dec 03 13:50:17 crc kubenswrapper[4666]: I1203 13:50:17.867522 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-96d8bfbbf-pd9x2_c6fc5a47-ba09-4985-b0d7-26d824dd60e3/neutron-api/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.021569 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-cknnk_109066d4-b3b2-4ec6-ba71-cfc35d9ca300/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.028001 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-96d8bfbbf-pd9x2_c6fc5a47-ba09-4985-b0d7-26d824dd60e3/neutron-httpd/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.449663 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d91a5463-a0cd-40be-90b0-e01d8f1ebdf3/nova-api-log/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.618985 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_aba61f3d-9288-44a6-b194-c136cc1bda0a/nova-cell0-conductor-conductor/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.855669 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_31efcd1f-210a-408a-b711-6862b6537a7d/nova-cell1-conductor-conductor/0.log" Dec 03 13:50:18 crc kubenswrapper[4666]: I1203 13:50:18.986890 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d91a5463-a0cd-40be-90b0-e01d8f1ebdf3/nova-api-api/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.002238 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_22c93a41-9194-4d6c-a77a-3310870cb513/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.088296 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-t5cm7_dfec4a43-8c2c-4dab-b3c1-2bc56e71d330/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:19 crc 
kubenswrapper[4666]: I1203 13:50:19.347181 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8ae0d3e5-4249-417a-aac2-5280115b1213/nova-metadata-log/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.585613 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2a6eed91-42df-4c6b-aaa0-7882ecfb941a/nova-scheduler-scheduler/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.717301 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/mysql-bootstrap/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.857629 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/galera/0.log" Dec 03 13:50:19 crc kubenswrapper[4666]: I1203 13:50:19.952779 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d17da10a-51b4-4ed7-8bbc-2b37be248419/mysql-bootstrap/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.071204 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/mysql-bootstrap/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.357999 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/mysql-bootstrap/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.377677 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_43a43416-3214-46ba-8a00-6939bb265c8a/galera/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.552076 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a7c6b242-ba03-4e43-9061-e908c5af1c78/openstackclient/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.645247 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l9wjt_844dc007-fbd5-4ca4-9f2f-dc3f2382a653/openstack-network-exporter/0.log" Dec 03 13:50:20 crc kubenswrapper[4666]: I1203 13:50:20.892323 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nsf9r_e0d2b1d4-5613-4ee7-a95d-ba0bc47b8342/ovn-controller/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.113026 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server-init/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.264506 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8ae0d3e5-4249-417a-aac2-5280115b1213/nova-metadata-metadata/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.328632 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovs-vswitchd/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.341923 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server-init/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.446115 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mdtd4_2d005e60-fcd2-4546-a783-e4770dd9e1d5/ovsdb-server/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.593040 4666 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-8bt2q_1f8dd079-749d-4ad3-8365-eb026d693512/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:21 crc kubenswrapper[4666]: I1203 13:50:21.910232 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e9fa52f-aa3b-4705-8a71-e48befc92571/openstack-network-exporter/0.log" Dec 03 13:50:22 crc kubenswrapper[4666]: I1203 13:50:22.026423 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e9fa52f-aa3b-4705-8a71-e48befc92571/ovn-northd/0.log" Dec 03 13:50:22 crc kubenswrapper[4666]: I1203 13:50:22.193532 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55083d6a-bded-48e2-a0ce-3befa24ce873/openstack-network-exporter/0.log" Dec 03 13:50:22 crc kubenswrapper[4666]: I1203 13:50:22.569276 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bdd6a24-e604-459e-8eba-ea0d2638fdf5/openstack-network-exporter/0.log" Dec 03 13:50:22 crc kubenswrapper[4666]: I1203 13:50:22.790660 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55083d6a-bded-48e2-a0ce-3befa24ce873/ovsdbserver-nb/0.log" Dec 03 13:50:22 crc kubenswrapper[4666]: I1203 13:50:22.838560 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bdd6a24-e604-459e-8eba-ea0d2638fdf5/ovsdbserver-sb/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.030781 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9676dbdb4-pcj6f_08966743-608f-40d3-9a26-2515ef964f0f/placement-api/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.164244 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/setup-container/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.164820 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9676dbdb4-pcj6f_08966743-608f-40d3-9a26-2515ef964f0f/placement-log/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.417906 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/setup-container/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.438323 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/setup-container/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.516360 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_20f2c961-32c5-4a6e-8d18-5296c889d4a3/rabbitmq/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.715837 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/setup-container/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.773307 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a39a37c-b566-4726-8e2b-84be35262830/rabbitmq/0.log" Dec 03 13:50:23 crc kubenswrapper[4666]: I1203 13:50:23.784257 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fr7xd_a854a0c8-aad2-4681-9077-c8abd034fa73/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:24 crc 
kubenswrapper[4666]: I1203 13:50:24.003186 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-flnhs_434bedfb-c0c1-45d4-ade6-8e5112122e58/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:24 crc kubenswrapper[4666]: I1203 13:50:24.034237 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-4cl4j_de91e472-2cc8-4eaf-91a3-49719f18e3f3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:24 crc kubenswrapper[4666]: I1203 13:50:24.231169 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-npxb5_15e79024-d1c5-4689-900b-92ded975568d/ssh-known-hosts-edpm-deployment/0.log" Dec 03 13:50:24 crc kubenswrapper[4666]: I1203 13:50:24.303960 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_33fe3655-5d2c-48bd-8a4b-f436570d149c/tempest-tests-tempest-tests-runner/0.log" Dec 03 13:50:24 crc kubenswrapper[4666]: I1203 13:50:24.443045 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_57ab82ad-c603-403b-8a3b-9e32e2ffc1ea/test-operator-logs-container/0.log" Dec 03 13:50:24 crc kubenswrapper[4666]: I1203 13:50:24.568881 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9drzh_acf58997-af21-4832-a74c-f81057c84d08/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 13:50:40 crc kubenswrapper[4666]: I1203 13:50:40.781376 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e4cbcde8-77f8-47d9-b6c6-c52f5daf4f53/memcached/0.log" Dec 03 13:50:51 crc kubenswrapper[4666]: I1203 13:50:51.777673 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp8qs_72fef244-af95-4c84-889b-04317e2f85e4/kube-rbac-proxy/0.log" Dec 03 13:50:51 crc kubenswrapper[4666]: I1203 13:50:51.813167 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp8qs_72fef244-af95-4c84-889b-04317e2f85e4/manager/0.log" Dec 03 13:50:51 crc kubenswrapper[4666]: I1203 13:50:51.967982 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-c4plc_75171f12-3098-437a-a941-31312676f362/kube-rbac-proxy/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.049046 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-c4plc_75171f12-3098-437a-a941-31312676f362/manager/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.171480 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-jzzkf_21ae197d-ae5d-4129-b1db-114a42dc5eb8/kube-rbac-proxy/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.186995 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-jzzkf_21ae197d-ae5d-4129-b1db-114a42dc5eb8/manager/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.221426 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.376624 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.404901 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.622778 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/pull/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.627468 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.632477 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/util/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.638611 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_febeaa8901868058bbd01f17d7045c53a183b798c937365d541a91fcfall4b6_9ad938cb-69f7-46d8-9831-21332a984dfc/extract/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.839705 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-j6dh9_6b5f798a-8be3-4c12-948b-4b9ff35d14ba/manager/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.845008 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-j6dh9_6b5f798a-8be3-4c12-948b-4b9ff35d14ba/kube-rbac-proxy/0.log" Dec 03 13:50:52 crc kubenswrapper[4666]: I1203 13:50:52.995499 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9cbjr_330ae135-611a-4ae6-ba73-fcb6a911c299/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.011284 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dn9wg_1d9a62f9-0c20-4033-84d4-ade04922d04a/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.012487 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9cbjr_330ae135-611a-4ae6-ba73-fcb6a911c299/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.153044 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dn9wg_1d9a62f9-0c20-4033-84d4-ade04922d04a/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.189586 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-vxrg7_e9197948-361b-43e7-8cc6-db509c80c7b1/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 
13:50:53.411361 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-97svz_dce8c65f-3951-4e68-a044-c4c59638fd05/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.427707 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-vxrg7_e9197948-361b-43e7-8cc6-db509c80c7b1/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.477181 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-97svz_dce8c65f-3951-4e68-a044-c4c59638fd05/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.598893 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-n287g_b3582f8c-2777-4291-bc6a-42953fd2d928/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.653947 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-n287g_b3582f8c-2777-4291-bc6a-42953fd2d928/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.746175 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5797d476c-ntgb9_e4686a7d-808f-47e8-b5cd-ec3af299a7f2/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.805862 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5797d476c-ntgb9_e4686a7d-808f-47e8-b5cd-ec3af299a7f2/manager/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.871719 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7lkjq_78813232-79b6-4483-86cb-069995914531/kube-rbac-proxy/0.log" Dec 03 13:50:53 crc kubenswrapper[4666]: I1203 13:50:53.956528 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-7lkjq_78813232-79b6-4483-86cb-069995914531/manager/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.084328 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-khplb_8048d3e0-a035-4a85-92ad-ca11dc24ccbe/kube-rbac-proxy/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.092757 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-khplb_8048d3e0-a035-4a85-92ad-ca11dc24ccbe/manager/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.227207 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ckxvk_d90e913f-9878-4644-b0f7-d0e313b8f897/kube-rbac-proxy/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.338644 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ckxvk_d90e913f-9878-4644-b0f7-d0e313b8f897/manager/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.419757 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-q5d7b_5a914e37-4302-4c77-8d4b-6c509dfbfc4e/manager/0.log" Dec 03 13:50:54 crc 
kubenswrapper[4666]: I1203 13:50:54.421722 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-q5d7b_5a914e37-4302-4c77-8d4b-6c509dfbfc4e/kube-rbac-proxy/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.568275 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt_865a9d83-50b6-49fb-87f8-c46fa1453ed0/kube-rbac-proxy/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.602568 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t2wqt_865a9d83-50b6-49fb-87f8-c46fa1453ed0/manager/0.log" Dec 03 13:50:54 crc kubenswrapper[4666]: I1203 13:50:54.959731 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f9cd9598-chsns_7aa8983e-49b3-4356-aae3-5388d37ae886/operator/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.018203 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z7hzf_04142222-9a39-4c8f-81b1-df4035625463/registry-server/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.184029 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rt2zn_1d364f72-b379-4591-b3f4-17997cbcba6e/kube-rbac-proxy/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.366644 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-rt2zn_1d364f72-b379-4591-b3f4-17997cbcba6e/manager/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.378245 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bwxhd_deefb3d8-d96a-4e86-839d-8d8a561f4645/kube-rbac-proxy/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.429051 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bwxhd_deefb3d8-d96a-4e86-839d-8d8a561f4645/manager/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.603224 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-d49kn_e0637cb9-5703-4e26-b526-592b818a5304/operator/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.679131 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-x2zqw_20594b02-a42f-4747-abfc-cbee34847d81/kube-rbac-proxy/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.822352 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-x2zqw_20594b02-a42f-4747-abfc-cbee34847d81/manager/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.878187 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-bcvbt_494c67d4-f61e-468c-a8d8-21a877c690e8/kube-rbac-proxy/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.891461 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54468f9998-5pr6c_ecdf8dc0-fab8-477f-ad9e-d0c0b0b83e45/manager/0.log" Dec 03 13:50:55 crc kubenswrapper[4666]: I1203 13:50:55.977215 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-bcvbt_494c67d4-f61e-468c-a8d8-21a877c690e8/manager/0.log" Dec 03 13:50:56 crc kubenswrapper[4666]: I1203 13:50:56.081169 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xwp75_39681ef6-2d50-4509-a81e-d6cd102695cd/kube-rbac-proxy/0.log" Dec 03 13:50:56 crc kubenswrapper[4666]: I1203 13:50:56.122786 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xwp75_39681ef6-2d50-4509-a81e-d6cd102695cd/manager/0.log" Dec 03 13:50:56 crc kubenswrapper[4666]: I1203 13:50:56.219658 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xmmjr_f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f/kube-rbac-proxy/0.log" Dec 03 13:50:56 crc kubenswrapper[4666]: I1203 13:50:56.304015 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xmmjr_f74d03e2-c8ec-4e79-af8b-0fa1a5baff0f/manager/0.log" Dec 03 13:51:13 crc kubenswrapper[4666]: I1203 13:51:13.899242 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bsdf6_61eb4df3-6d70-43f8-aaf6-9ad6b8f2abae/control-plane-machine-set-operator/0.log" Dec 03 13:51:14 crc kubenswrapper[4666]: I1203 13:51:14.095929 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dn4sx_7d685477-11f2-4bfb-98c2-6eb76b6697c3/kube-rbac-proxy/0.log" Dec 03 13:51:14 crc kubenswrapper[4666]: I1203 13:51:14.096395 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dn4sx_7d685477-11f2-4bfb-98c2-6eb76b6697c3/machine-api-operator/0.log" Dec 03 13:51:25 crc kubenswrapper[4666]: I1203 13:51:25.527462 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-q94td_100d1193-8c3e-442e-8e9c-9983b5292555/cert-manager-controller/0.log" Dec 03 13:51:25 crc kubenswrapper[4666]: I1203 13:51:25.662107 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-g7hwp_9dafb972-28e7-4392-9d8f-0d6036c5adab/cert-manager-cainjector/0.log" Dec 03 13:51:25 crc kubenswrapper[4666]: I1203 13:51:25.705393 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nf766_b7fc3a3d-9867-4055-b071-c43574b66e7a/cert-manager-webhook/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.158055 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-m4v87_8f7dae47-5e1c-4945-9827-33a00c4c0d66/nmstate-console-plugin/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.285958 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7x7ll_02c230c0-43ab-4476-b3dd-64cb686195c0/nmstate-handler/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.318805 4666 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5gp27_65d7f250-10bf-4a17-879f-856d2ea16b91/kube-rbac-proxy/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.358167 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5gp27_65d7f250-10bf-4a17-879f-856d2ea16b91/nmstate-metrics/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.529784 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-wqnxn_c28a05b1-6eb0-43a1-a581-c8a5f3b956b6/nmstate-operator/0.log" Dec 03 13:51:37 crc kubenswrapper[4666]: I1203 13:51:37.538926 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-vvj59_f43cdd10-999d-470a-89d0-909660ec7e67/nmstate-webhook/0.log" Dec 03 13:51:50 crc kubenswrapper[4666]: I1203 13:51:50.924892 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4zn7s_782afea8-e67f-4724-992b-6d318c9f9e5c/kube-rbac-proxy/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.032774 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4zn7s_782afea8-e67f-4724-992b-6d318c9f9e5c/controller/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.198257 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-xjlzk_eaa2e763-b8bc-4f22-9bbf-43d36d8c2088/frr-k8s-webhook-server/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.232633 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.498010 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.518733 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.554448 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.572478 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.779397 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.789831 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.796406 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.819289 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log" Dec 03 13:51:51 crc kubenswrapper[4666]: I1203 13:51:51.996646 4666 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-metrics/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.020620 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-frr-files/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.025445 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/controller/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.029597 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/cp-reloader/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.208873 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/kube-rbac-proxy-frr/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.236588 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/frr-metrics/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.258139 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/kube-rbac-proxy/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.478631 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/reloader/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.535451 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b8d9b7676-d2hwb_c594fca4-0d6a-47e1-acc4-b9434ce17bb9/manager/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.764254 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-868f45c797-svsbn_9606ffd7-351c-4485-b17a-779a724a1859/webhook-server/0.log" Dec 03 13:51:52 crc kubenswrapper[4666]: I1203 13:51:52.975522 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27md4_c1d0b522-d828-4f16-9d3b-64b16697898a/kube-rbac-proxy/0.log" Dec 03 13:51:53 crc kubenswrapper[4666]: I1203 13:51:53.406544 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27md4_c1d0b522-d828-4f16-9d3b-64b16697898a/speaker/0.log" Dec 03 13:51:53 crc kubenswrapper[4666]: I1203 13:51:53.695835 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxmn_b1b3e6c6-2466-465d-8b6a-13c75e60ed62/frr/0.log" Dec 03 13:52:05 crc kubenswrapper[4666]: I1203 13:52:05.768375 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log" Dec 03 13:52:05 crc kubenswrapper[4666]: I1203 13:52:05.891665 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log" Dec 03 13:52:05 crc kubenswrapper[4666]: I1203 13:52:05.901173 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log" Dec 03 13:52:05 crc kubenswrapper[4666]: 
I1203 13:52:05.946162 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.138331 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/extract/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.169124 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/util/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.172268 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkv98l_49e29685-fddc-4db1-acbf-07d806a3280e/pull/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.362251 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.546600 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.570407 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.593763 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.727072 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/util/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.774226 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/extract/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.813551 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wrsqg_8d46dfdf-d48c-494f-9535-b3d6c05f3b45/pull/0.log" Dec 03 13:52:06 crc kubenswrapper[4666]: I1203 13:52:06.923016 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.122536 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.144703 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log" 
Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.160142 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.354037 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-utilities/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.394390 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/extract-content/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.587449 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.826077 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.828197 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log" Dec 03 13:52:07 crc kubenswrapper[4666]: I1203 13:52:07.867821 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.013696 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqvpl_74eb5845-66cb-4c54-a2c7-53f59a686e0d/registry-server/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.036584 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-utilities/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.059751 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/extract-content/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.277205 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2vrds_d829bfc6-3cf6-4b30-a501-1586386d7698/marketplace-operator/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.452790 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.758134 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.786462 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5v6kg_4e039eec-3ce6-475e-9e89-8dc64fd04701/registry-server/0.log" Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.790656 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log" 
Dec 03 13:52:08 crc kubenswrapper[4666]: I1203 13:52:08.834747 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.001595 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-content/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.061002 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/extract-utilities/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.190677 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2fsqg_8229727b-4723-46f6-919d-1eb721caefd1/registry-server/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.265252 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.420495 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.463675 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.470397 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.708774 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-content/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.795027 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/extract-utilities/0.log" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.889328 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:09 crc kubenswrapper[4666]: E1203 13:52:09.890073 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7b87e3-18af-42e1-ae0f-898aef4b8223" containerName="container-00" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.890089 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7b87e3-18af-42e1-ae0f-898aef4b8223" containerName="container-00" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.890300 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7b87e3-18af-42e1-ae0f-898aef4b8223" containerName="container-00" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.892886 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:09 crc kubenswrapper[4666]: I1203 13:52:09.910719 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.059289 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.059433 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz265\" (UniqueName: \"kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.059500 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.161690 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.161789 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz265\" (UniqueName: \"kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.161829 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.162302 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.162503 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.213941 4666 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qz265\" (UniqueName: \"kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265\") pod \"redhat-marketplace-n4bk2\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.241596 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwb2h_82ff889b-3fab-481d-b9a0-36991fb87e8f/registry-server/0.log" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.261853 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:10 crc kubenswrapper[4666]: I1203 13:52:10.766598 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:11 crc kubenswrapper[4666]: I1203 13:52:11.161211 4666 generic.go:334] "Generic (PLEG): container finished" podID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerID="62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d" exitCode=0 Dec 03 13:52:11 crc kubenswrapper[4666]: I1203 13:52:11.161279 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerDied","Data":"62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d"} Dec 03 13:52:11 crc kubenswrapper[4666]: I1203 13:52:11.161564 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerStarted","Data":"8c44b3810f99fa8e2023a79fb2a44ee4535862ba316db702909816237bc0a79d"} Dec 03 13:52:11 crc kubenswrapper[4666]: E1203 13:52:11.189508 4666 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ca4507_d945_49db_96d3_b2e5bb3a88ca.slice/crio-conmon-62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ca4507_d945_49db_96d3_b2e5bb3a88ca.slice/crio-62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d.scope\": RecentStats: unable to find data in memory cache]" Dec 03 13:52:12 crc kubenswrapper[4666]: I1203 13:52:12.171192 4666 generic.go:334] "Generic (PLEG): container finished" podID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerID="cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707" exitCode=0 Dec 03 13:52:12 crc kubenswrapper[4666]: I1203 13:52:12.171294 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerDied","Data":"cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707"} Dec 03 13:52:13 crc kubenswrapper[4666]: I1203 13:52:13.181694 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerStarted","Data":"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24"} Dec 03 13:52:13 crc kubenswrapper[4666]: I1203 13:52:13.207950 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n4bk2" 
podStartSLOduration=2.753448599 podStartE2EDuration="4.207923993s" podCreationTimestamp="2025-12-03 13:52:09 +0000 UTC" firstStartedPulling="2025-12-03 13:52:11.167501963 +0000 UTC m=+5920.012463014" lastFinishedPulling="2025-12-03 13:52:12.621977357 +0000 UTC m=+5921.466938408" observedRunningTime="2025-12-03 13:52:13.19886868 +0000 UTC m=+5922.043829731" watchObservedRunningTime="2025-12-03 13:52:13.207923993 +0000 UTC m=+5922.052885054" Dec 03 13:52:20 crc kubenswrapper[4666]: I1203 13:52:20.262568 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:20 crc kubenswrapper[4666]: I1203 13:52:20.263170 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:20 crc kubenswrapper[4666]: I1203 13:52:20.318224 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:21 crc kubenswrapper[4666]: I1203 13:52:21.297738 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:21 crc kubenswrapper[4666]: I1203 13:52:21.455080 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.265880 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n4bk2" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="registry-server" containerID="cri-o://4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24" gracePeriod=2 Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.771618 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.825605 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities\") pod \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.825722 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content\") pod \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.825763 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz265\" (UniqueName: \"kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265\") pod \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\" (UID: \"56ca4507-d945-49db-96d3-b2e5bb3a88ca\") " Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.828594 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities" (OuterVolumeSpecName: "utilities") pod "56ca4507-d945-49db-96d3-b2e5bb3a88ca" (UID: "56ca4507-d945-49db-96d3-b2e5bb3a88ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.832366 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265" (OuterVolumeSpecName: "kube-api-access-qz265") pod "56ca4507-d945-49db-96d3-b2e5bb3a88ca" (UID: "56ca4507-d945-49db-96d3-b2e5bb3a88ca"). InnerVolumeSpecName "kube-api-access-qz265". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.853054 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56ca4507-d945-49db-96d3-b2e5bb3a88ca" (UID: "56ca4507-d945-49db-96d3-b2e5bb3a88ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.928472 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz265\" (UniqueName: \"kubernetes.io/projected/56ca4507-d945-49db-96d3-b2e5bb3a88ca-kube-api-access-qz265\") on node \"crc\" DevicePath \"\"" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.928516 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:52:23 crc kubenswrapper[4666]: I1203 13:52:23.928533 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ca4507-d945-49db-96d3-b2e5bb3a88ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.278304 4666 generic.go:334] "Generic (PLEG): container finished" podID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerID="4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24" exitCode=0 Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.278394 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4bk2" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.278446 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerDied","Data":"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24"} Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.278477 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4bk2" event={"ID":"56ca4507-d945-49db-96d3-b2e5bb3a88ca","Type":"ContainerDied","Data":"8c44b3810f99fa8e2023a79fb2a44ee4535862ba316db702909816237bc0a79d"} Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.278496 4666 scope.go:117] "RemoveContainer" containerID="4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.300199 4666 scope.go:117] "RemoveContainer" containerID="cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.325320 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.331307 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4bk2"] Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.337617 4666 scope.go:117] "RemoveContainer" containerID="62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.377793 4666 scope.go:117] "RemoveContainer" containerID="4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24" Dec 03 13:52:24 crc kubenswrapper[4666]: E1203 13:52:24.378383 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24\": container with ID starting with 4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24 not found: ID does not exist" containerID="4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.378428 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24"} err="failed to get container status \"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24\": rpc error: code = NotFound desc = could not find container \"4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24\": container with ID starting with 4853921ff1edeead424a91bf0f054aa8f9ca65d2d1bd30a26dedea000882ef24 not found: ID does not exist" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.378458 4666 scope.go:117] "RemoveContainer" containerID="cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707" Dec 03 13:52:24 crc kubenswrapper[4666]: E1203 13:52:24.379825 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707\": container with ID starting with cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707 not found: ID does not exist" containerID="cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.379859 4666 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707"} err="failed to get container status \"cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707\": rpc error: code = NotFound desc = could not find container \"cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707\": container with ID starting with cc1250aa14b2db2b25d10fc7b967a721e31a58dae7b182f3d6243c0dbeede707 not found: ID does not exist" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.379881 4666 scope.go:117] "RemoveContainer" containerID="62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d" Dec 03 13:52:24 crc kubenswrapper[4666]: E1203 13:52:24.380341 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d\": container with ID starting with 62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d not found: ID does not exist" containerID="62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d" Dec 03 13:52:24 crc kubenswrapper[4666]: I1203 13:52:24.380390 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d"} err="failed to get container status \"62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d\": rpc error: code = NotFound desc = could not find container \"62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d\": container with ID starting with 62a6090b52da741e86cd649658eccc40b784b72325bd37f2613b36e9e092b30d not found: ID does not exist" Dec 03 13:52:25 crc kubenswrapper[4666]: I1203 13:52:25.435333 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" path="/var/lib/kubelet/pods/56ca4507-d945-49db-96d3-b2e5bb3a88ca/volumes" Dec 03 13:52:39 crc kubenswrapper[4666]: I1203 13:52:39.866488 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:52:39 crc kubenswrapper[4666]: I1203 13:52:39.866844 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:52:40 crc kubenswrapper[4666]: E1203 13:52:40.885079 4666 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:45338->38.102.83.110:44485: write tcp 38.102.83.110:45338->38.102.83.110:44485: write: broken pipe Dec 03 13:53:09 crc kubenswrapper[4666]: I1203 13:53:09.866341 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:53:09 crc kubenswrapper[4666]: I1203 13:53:09.868350 4666 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:53:39 crc kubenswrapper[4666]: I1203 13:53:39.866709 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 13:53:39 crc kubenswrapper[4666]: I1203 13:53:39.867374 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 13:53:39 crc kubenswrapper[4666]: I1203 13:53:39.867428 4666 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" Dec 03 13:53:39 crc kubenswrapper[4666]: I1203 13:53:39.868453 4666 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820"} pod="openshift-machine-config-operator/machine-config-daemon-q9g72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 13:53:39 crc kubenswrapper[4666]: I1203 13:53:39.868506 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" containerID="cri-o://32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" gracePeriod=600 Dec 03 13:53:40 crc kubenswrapper[4666]: E1203 13:53:40.499369 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:53:40 crc kubenswrapper[4666]: I1203 13:53:40.989014 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerDied","Data":"32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820"} Dec 03 13:53:40 crc kubenswrapper[4666]: I1203 13:53:40.989255 4666 scope.go:117] "RemoveContainer" containerID="a9cc00e19a46e3efb1f89908932881acb239a1fb796cadd29fdfb9a733ca6dd2" Dec 03 13:53:40 crc kubenswrapper[4666]: I1203 13:53:40.989027 4666 generic.go:334] "Generic (PLEG): container finished" podID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" exitCode=0 Dec 03 13:53:40 crc kubenswrapper[4666]: I1203 13:53:40.989902 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:53:40 crc kubenswrapper[4666]: E1203 
13:53:40.990232 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:53:51 crc kubenswrapper[4666]: I1203 13:53:51.431738 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:53:51 crc kubenswrapper[4666]: E1203 13:53:51.432636 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:54:02 crc kubenswrapper[4666]: I1203 13:54:02.249070 4666 generic.go:334] "Generic (PLEG): container finished" podID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerID="fbf065bb3359a0290f48ca467acaa05eb19fb2ebda08ddbfbb48dbcdadb12650" exitCode=0 Dec 03 13:54:02 crc kubenswrapper[4666]: I1203 13:54:02.249135 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-crktr/must-gather-psf59" event={"ID":"7a392c6a-570e-4785-b3a2-e3bb2aa176f5","Type":"ContainerDied","Data":"fbf065bb3359a0290f48ca467acaa05eb19fb2ebda08ddbfbb48dbcdadb12650"} Dec 03 13:54:02 crc kubenswrapper[4666]: I1203 13:54:02.250627 4666 scope.go:117] "RemoveContainer" containerID="fbf065bb3359a0290f48ca467acaa05eb19fb2ebda08ddbfbb48dbcdadb12650" Dec 03 13:54:02 crc kubenswrapper[4666]: I1203 13:54:02.603175 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crktr_must-gather-psf59_7a392c6a-570e-4785-b3a2-e3bb2aa176f5/gather/0.log" Dec 03 13:54:04 crc kubenswrapper[4666]: I1203 13:54:04.423210 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:54:04 crc kubenswrapper[4666]: E1203 13:54:04.425681 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.195547 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-crktr/must-gather-psf59"] Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.196353 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-crktr/must-gather-psf59" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="copy" containerID="cri-o://f0343179ed702fd94feb3cd00c9584fd2155535d64c8f4b82412df20725015ff" gracePeriod=2 Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.211053 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-crktr/must-gather-psf59"] Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.377858 4666 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-crktr_must-gather-psf59_7a392c6a-570e-4785-b3a2-e3bb2aa176f5/copy/0.log" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.378896 4666 generic.go:334] "Generic (PLEG): container finished" podID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerID="f0343179ed702fd94feb3cd00c9584fd2155535d64c8f4b82412df20725015ff" exitCode=143 Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.655698 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crktr_must-gather-psf59_7a392c6a-570e-4785-b3a2-e3bb2aa176f5/copy/0.log" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.656462 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.769207 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output\") pod \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.769294 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x2vq\" (UniqueName: \"kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq\") pod \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\" (UID: \"7a392c6a-570e-4785-b3a2-e3bb2aa176f5\") " Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.776146 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq" (OuterVolumeSpecName: "kube-api-access-2x2vq") pod "7a392c6a-570e-4785-b3a2-e3bb2aa176f5" (UID: "7a392c6a-570e-4785-b3a2-e3bb2aa176f5"). InnerVolumeSpecName "kube-api-access-2x2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.871629 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x2vq\" (UniqueName: \"kubernetes.io/projected/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-kube-api-access-2x2vq\") on node \"crc\" DevicePath \"\"" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.937582 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7a392c6a-570e-4785-b3a2-e3bb2aa176f5" (UID: "7a392c6a-570e-4785-b3a2-e3bb2aa176f5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:54:13 crc kubenswrapper[4666]: I1203 13:54:13.973076 4666 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a392c6a-570e-4785-b3a2-e3bb2aa176f5-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 13:54:14 crc kubenswrapper[4666]: I1203 13:54:14.406659 4666 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-crktr_must-gather-psf59_7a392c6a-570e-4785-b3a2-e3bb2aa176f5/copy/0.log" Dec 03 13:54:14 crc kubenswrapper[4666]: I1203 13:54:14.408248 4666 scope.go:117] "RemoveContainer" containerID="f0343179ed702fd94feb3cd00c9584fd2155535d64c8f4b82412df20725015ff" Dec 03 13:54:14 crc kubenswrapper[4666]: I1203 13:54:14.408307 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-crktr/must-gather-psf59" Dec 03 13:54:14 crc kubenswrapper[4666]: I1203 13:54:14.460211 4666 scope.go:117] "RemoveContainer" containerID="fbf065bb3359a0290f48ca467acaa05eb19fb2ebda08ddbfbb48dbcdadb12650" Dec 03 13:54:15 crc kubenswrapper[4666]: I1203 13:54:15.451285 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" path="/var/lib/kubelet/pods/7a392c6a-570e-4785-b3a2-e3bb2aa176f5/volumes" Dec 03 13:54:19 crc kubenswrapper[4666]: I1203 13:54:19.424180 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:54:19 crc kubenswrapper[4666]: E1203 13:54:19.425127 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:54:30 crc kubenswrapper[4666]: I1203 13:54:30.423591 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:54:30 crc kubenswrapper[4666]: E1203 13:54:30.424391 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:54:45 crc kubenswrapper[4666]: I1203 13:54:45.423891 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:54:45 crc kubenswrapper[4666]: E1203 13:54:45.425622 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:54:54 crc kubenswrapper[4666]: I1203 13:54:54.474772 4666 scope.go:117] "RemoveContainer" containerID="e26b865d275e439408b805d25ff97f506eb39acc7090a9e263fd6b0c197a21f8" Dec 03 13:54:58 crc kubenswrapper[4666]: I1203 13:54:58.423578 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:54:58 crc kubenswrapper[4666]: E1203 13:54:58.425340 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:55:11 crc kubenswrapper[4666]: I1203 13:55:11.433012 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" 
Dec 03 13:55:11 crc kubenswrapper[4666]: E1203 13:55:11.434846 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:55:22 crc kubenswrapper[4666]: I1203 13:55:22.424514 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:55:22 crc kubenswrapper[4666]: E1203 13:55:22.425940 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.863718 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:34 crc kubenswrapper[4666]: E1203 13:55:34.866778 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="extract-content" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.867362 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="extract-content" Dec 03 13:55:34 crc kubenswrapper[4666]: E1203 13:55:34.867383 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="gather" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.867390 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="gather" Dec 03 13:55:34 crc kubenswrapper[4666]: E1203 13:55:34.867412 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="registry-server" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.867418 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="registry-server" Dec 03 13:55:34 crc kubenswrapper[4666]: E1203 13:55:34.867425 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="extract-utilities" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.867431 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="extract-utilities" Dec 03 13:55:34 crc kubenswrapper[4666]: E1203 13:55:34.867451 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="copy" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.867456 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="copy" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.868471 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ca4507-d945-49db-96d3-b2e5bb3a88ca" containerName="registry-server" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.868495 4666 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="gather" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.868514 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a392c6a-570e-4785-b3a2-e3bb2aa176f5" containerName="copy" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.869931 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:34 crc kubenswrapper[4666]: I1203 13:55:34.874967 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.049672 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.049729 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xz7q\" (UniqueName: \"kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.049761 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.151313 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.151393 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xz7q\" (UniqueName: \"kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.151441 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.151970 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: 
I1203 13:55:35.152076 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.177574 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xz7q\" (UniqueName: \"kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q\") pod \"community-operators-7hr9f\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.199877 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.428424 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:55:35 crc kubenswrapper[4666]: E1203 13:55:35.434022 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:55:35 crc kubenswrapper[4666]: I1203 13:55:35.751884 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:36 crc kubenswrapper[4666]: I1203 13:55:36.160516 4666 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerID="7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d" exitCode=0 Dec 03 13:55:36 crc kubenswrapper[4666]: I1203 13:55:36.160560 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerDied","Data":"7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d"} Dec 03 13:55:36 crc kubenswrapper[4666]: I1203 13:55:36.160589 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerStarted","Data":"eb8698cec158c1aa188eab2c2683e0539ae4530850c3ccbae5b2a74ab35e38f0"} Dec 03 13:55:36 crc kubenswrapper[4666]: I1203 13:55:36.162768 4666 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 13:55:37 crc kubenswrapper[4666]: I1203 13:55:37.171922 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerStarted","Data":"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5"} Dec 03 13:55:38 crc kubenswrapper[4666]: I1203 13:55:38.181719 4666 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerID="2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5" exitCode=0 Dec 03 13:55:38 crc kubenswrapper[4666]: I1203 13:55:38.181754 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerDied","Data":"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5"} Dec 03 13:55:39 crc kubenswrapper[4666]: I1203 13:55:39.192651 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerStarted","Data":"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2"} Dec 03 13:55:39 crc kubenswrapper[4666]: I1203 13:55:39.233808 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hr9f" podStartSLOduration=2.7699570060000003 podStartE2EDuration="5.233783494s" podCreationTimestamp="2025-12-03 13:55:34 +0000 UTC" firstStartedPulling="2025-12-03 13:55:36.162503852 +0000 UTC m=+6125.007464913" lastFinishedPulling="2025-12-03 13:55:38.62633035 +0000 UTC m=+6127.471291401" observedRunningTime="2025-12-03 13:55:39.218590605 +0000 UTC m=+6128.063551666" watchObservedRunningTime="2025-12-03 13:55:39.233783494 +0000 UTC m=+6128.078744565" Dec 03 13:55:45 crc kubenswrapper[4666]: I1203 13:55:45.200079 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:45 crc kubenswrapper[4666]: I1203 13:55:45.200673 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:45 crc kubenswrapper[4666]: I1203 13:55:45.249053 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:45 crc kubenswrapper[4666]: I1203 13:55:45.296613 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:45 crc kubenswrapper[4666]: I1203 13:55:45.485902 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.267031 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hr9f" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="registry-server" containerID="cri-o://39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2" gracePeriod=2 Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.700438 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.802161 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities\") pod \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.802439 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content\") pod \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.802533 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xz7q\" (UniqueName: \"kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q\") pod \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\" (UID: \"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a\") " Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.803156 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities" (OuterVolumeSpecName: "utilities") pod "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" (UID: "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.808955 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q" (OuterVolumeSpecName: "kube-api-access-2xz7q") pod "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" (UID: "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a"). InnerVolumeSpecName "kube-api-access-2xz7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.905320 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.905355 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xz7q\" (UniqueName: \"kubernetes.io/projected/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-kube-api-access-2xz7q\") on node \"crc\" DevicePath \"\"" Dec 03 13:55:47 crc kubenswrapper[4666]: I1203 13:55:47.941176 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" (UID: "cbe6c031-c4fb-4cb1-91f8-dcad3cda071a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.007872 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.297921 4666 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerID="39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2" exitCode=0 Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.298004 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hr9f" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.298028 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerDied","Data":"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2"} Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.298974 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hr9f" event={"ID":"cbe6c031-c4fb-4cb1-91f8-dcad3cda071a","Type":"ContainerDied","Data":"eb8698cec158c1aa188eab2c2683e0539ae4530850c3ccbae5b2a74ab35e38f0"} Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.298997 4666 scope.go:117] "RemoveContainer" containerID="39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.322323 4666 scope.go:117] "RemoveContainer" containerID="2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.331734 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.340270 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7hr9f"] Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.362190 4666 scope.go:117] "RemoveContainer" containerID="7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.412343 4666 scope.go:117] "RemoveContainer" containerID="39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2" Dec 03 13:55:48 crc kubenswrapper[4666]: E1203 13:55:48.413111 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2\": container with ID starting with 39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2 not found: ID does not exist" containerID="39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.413164 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2"} err="failed to get container status \"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2\": rpc error: code = NotFound desc = could not find container \"39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2\": container with ID starting with 39dddd58d9db196ac3160a00f02596c218304b9675eec91696f6ebb2eb6c9ec2 not found: ID does not exist" Dec 03 
13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.413194 4666 scope.go:117] "RemoveContainer" containerID="2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5" Dec 03 13:55:48 crc kubenswrapper[4666]: E1203 13:55:48.413671 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5\": container with ID starting with 2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5 not found: ID does not exist" containerID="2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.413740 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5"} err="failed to get container status \"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5\": rpc error: code = NotFound desc = could not find container \"2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5\": container with ID starting with 2a9851f2a25fdba7c17f3a742849ab4c0f407bf5bea3eadc0794c122ef55e6e5 not found: ID does not exist" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.413778 4666 scope.go:117] "RemoveContainer" containerID="7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d" Dec 03 13:55:48 crc kubenswrapper[4666]: E1203 13:55:48.414190 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d\": container with ID starting with 7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d not found: ID does not exist" containerID="7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.414242 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d"} err="failed to get container status \"7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d\": rpc error: code = NotFound desc = could not find container \"7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d\": container with ID starting with 7acb47b01462d910822ac48df2ed6e9e00b8347943d41d876c2005391b7eb54d not found: ID does not exist" Dec 03 13:55:48 crc kubenswrapper[4666]: I1203 13:55:48.423688 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:55:48 crc kubenswrapper[4666]: E1203 13:55:48.424076 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:55:49 crc kubenswrapper[4666]: I1203 13:55:49.441830 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" path="/var/lib/kubelet/pods/cbe6c031-c4fb-4cb1-91f8-dcad3cda071a/volumes" Dec 03 13:55:54 crc kubenswrapper[4666]: I1203 13:55:54.554908 4666 scope.go:117] "RemoveContainer" 
containerID="a563c3304661aae27601b23cfd23a903fba22ca3740082cf84bde2f9f5a51b66" Dec 03 13:55:59 crc kubenswrapper[4666]: I1203 13:55:59.423792 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:55:59 crc kubenswrapper[4666]: E1203 13:55:59.424600 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.150066 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:05 crc kubenswrapper[4666]: E1203 13:56:05.151166 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="extract-content" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.151184 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="extract-content" Dec 03 13:56:05 crc kubenswrapper[4666]: E1203 13:56:05.151222 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="extract-utilities" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.151232 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="extract-utilities" Dec 03 13:56:05 crc kubenswrapper[4666]: E1203 13:56:05.151255 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="registry-server" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.151263 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="registry-server" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.151519 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe6c031-c4fb-4cb1-91f8-dcad3cda071a" containerName="registry-server" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.153848 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.171314 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.251524 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.251594 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.251743 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqdr\" (UniqueName: \"kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.353393 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.353468 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.353533 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqdr\" (UniqueName: \"kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.354002 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.354115 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.372813 4666 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-trqdr\" (UniqueName: \"kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr\") pod \"certified-operators-zhzfm\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.473307 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:05 crc kubenswrapper[4666]: I1203 13:56:05.981228 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:06 crc kubenswrapper[4666]: I1203 13:56:06.477596 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerID="f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15" exitCode=0 Dec 03 13:56:06 crc kubenswrapper[4666]: I1203 13:56:06.477707 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerDied","Data":"f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15"} Dec 03 13:56:06 crc kubenswrapper[4666]: I1203 13:56:06.477912 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerStarted","Data":"35d1302e8508ba78f52569270862606aa80f8dc7814773749c91761c5248a487"} Dec 03 13:56:07 crc kubenswrapper[4666]: I1203 13:56:07.489165 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerStarted","Data":"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01"} Dec 03 13:56:08 crc kubenswrapper[4666]: I1203 13:56:08.498811 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerID="d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01" exitCode=0 Dec 03 13:56:08 crc kubenswrapper[4666]: I1203 13:56:08.498852 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerDied","Data":"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01"} Dec 03 13:56:09 crc kubenswrapper[4666]: I1203 13:56:09.508559 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerStarted","Data":"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8"} Dec 03 13:56:09 crc kubenswrapper[4666]: I1203 13:56:09.526525 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zhzfm" podStartSLOduration=2.069394825 podStartE2EDuration="4.526506023s" podCreationTimestamp="2025-12-03 13:56:05 +0000 UTC" firstStartedPulling="2025-12-03 13:56:06.48048288 +0000 UTC m=+6155.325443951" lastFinishedPulling="2025-12-03 13:56:08.937594098 +0000 UTC m=+6157.782555149" observedRunningTime="2025-12-03 13:56:09.522942337 +0000 UTC m=+6158.367903388" watchObservedRunningTime="2025-12-03 13:56:09.526506023 +0000 UTC m=+6158.371467074" Dec 03 13:56:14 crc kubenswrapper[4666]: I1203 13:56:14.423343 4666 scope.go:117] "RemoveContainer" 
containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:56:14 crc kubenswrapper[4666]: E1203 13:56:14.424184 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:56:15 crc kubenswrapper[4666]: I1203 13:56:15.473653 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:15 crc kubenswrapper[4666]: I1203 13:56:15.473719 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:15 crc kubenswrapper[4666]: I1203 13:56:15.527517 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:15 crc kubenswrapper[4666]: I1203 13:56:15.635559 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:15 crc kubenswrapper[4666]: I1203 13:56:15.766496 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:17 crc kubenswrapper[4666]: I1203 13:56:17.590363 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zhzfm" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="registry-server" containerID="cri-o://0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8" gracePeriod=2 Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.061856 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.163127 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities\") pod \"f4ac87fd-a598-4281-a717-15cc9cc39601\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.163682 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trqdr\" (UniqueName: \"kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr\") pod \"f4ac87fd-a598-4281-a717-15cc9cc39601\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.163870 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content\") pod \"f4ac87fd-a598-4281-a717-15cc9cc39601\" (UID: \"f4ac87fd-a598-4281-a717-15cc9cc39601\") " Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.164613 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities" (OuterVolumeSpecName: "utilities") pod "f4ac87fd-a598-4281-a717-15cc9cc39601" (UID: "f4ac87fd-a598-4281-a717-15cc9cc39601"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.170903 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr" (OuterVolumeSpecName: "kube-api-access-trqdr") pod "f4ac87fd-a598-4281-a717-15cc9cc39601" (UID: "f4ac87fd-a598-4281-a717-15cc9cc39601"). InnerVolumeSpecName "kube-api-access-trqdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.266053 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4ac87fd-a598-4281-a717-15cc9cc39601" (UID: "f4ac87fd-a598-4281-a717-15cc9cc39601"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.267711 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trqdr\" (UniqueName: \"kubernetes.io/projected/f4ac87fd-a598-4281-a717-15cc9cc39601-kube-api-access-trqdr\") on node \"crc\" DevicePath \"\"" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.267773 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.267798 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ac87fd-a598-4281-a717-15cc9cc39601-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.610394 4666 generic.go:334] "Generic (PLEG): container finished" podID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerID="0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8" exitCode=0 Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.610474 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerDied","Data":"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8"} Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.610541 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhzfm" event={"ID":"f4ac87fd-a598-4281-a717-15cc9cc39601","Type":"ContainerDied","Data":"35d1302e8508ba78f52569270862606aa80f8dc7814773749c91761c5248a487"} Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.610573 4666 scope.go:117] "RemoveContainer" containerID="0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.610571 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhzfm" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.638537 4666 scope.go:117] "RemoveContainer" containerID="d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.661590 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.671033 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zhzfm"] Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.678437 4666 scope.go:117] "RemoveContainer" containerID="f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.721764 4666 scope.go:117] "RemoveContainer" containerID="0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8" Dec 03 13:56:18 crc kubenswrapper[4666]: E1203 13:56:18.722570 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8\": container with ID starting with 0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8 not found: ID does not exist" containerID="0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.722639 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8"} err="failed to get container status \"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8\": rpc error: code = NotFound desc = could not find container \"0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8\": container with ID starting with 0e8d512429f895c2b2d472b1a304d00b40d80b81e16b9a95446b0795f520a3b8 not found: ID does not exist" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.722683 4666 scope.go:117] "RemoveContainer" containerID="d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01" Dec 03 13:56:18 crc kubenswrapper[4666]: E1203 13:56:18.723384 4666 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01\": container with ID starting with d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01 not found: ID does not exist" containerID="d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.723442 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01"} err="failed to get container status \"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01\": rpc error: code = NotFound desc = could not find container \"d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01\": container with ID starting with d1456f61ae7d3346338d3b953baf9a2558957de4fdc3173ae8212490eaf7ef01 not found: ID does not exist" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.723484 4666 scope.go:117] "RemoveContainer" containerID="f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15" Dec 03 13:56:18 crc kubenswrapper[4666]: E1203 13:56:18.723947 4666 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15\": container with ID starting with f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15 not found: ID does not exist" containerID="f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15" Dec 03 13:56:18 crc kubenswrapper[4666]: I1203 13:56:18.724003 4666 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15"} err="failed to get container status \"f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15\": rpc error: code = NotFound desc = could not find container \"f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15\": container with ID starting with f61f543e0c13cb4d77444083dc25cead8a7d51faae8eac27268586fe9bcebb15 not found: ID does not exist" Dec 03 13:56:19 crc kubenswrapper[4666]: I1203 13:56:19.448892 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" path="/var/lib/kubelet/pods/f4ac87fd-a598-4281-a717-15cc9cc39601/volumes" Dec 03 13:56:28 crc kubenswrapper[4666]: I1203 13:56:28.424032 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:56:28 crc kubenswrapper[4666]: E1203 13:56:28.425235 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:56:41 crc kubenswrapper[4666]: I1203 13:56:41.440166 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:56:41 crc kubenswrapper[4666]: E1203 13:56:41.443520 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:56:56 crc kubenswrapper[4666]: I1203 13:56:56.424153 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:56:56 crc kubenswrapper[4666]: E1203 13:56:56.424997 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:57:08 crc kubenswrapper[4666]: I1203 13:57:08.424027 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:57:08 crc kubenswrapper[4666]: E1203 13:57:08.424952 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:57:19 crc kubenswrapper[4666]: I1203 13:57:19.423317 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:57:19 crc kubenswrapper[4666]: E1203 13:57:19.424066 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.957418 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:27 crc kubenswrapper[4666]: E1203 13:57:27.958463 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="extract-content" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.958481 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="extract-content" Dec 03 13:57:27 crc kubenswrapper[4666]: E1203 13:57:27.958506 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="registry-server" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.958515 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="registry-server" Dec 03 13:57:27 crc kubenswrapper[4666]: E1203 13:57:27.958534 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="extract-utilities" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.958541 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="extract-utilities" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.958755 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ac87fd-a598-4281-a717-15cc9cc39601" containerName="registry-server" Dec 03 13:57:27 crc kubenswrapper[4666]: I1203 13:57:27.961561 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.023075 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.099660 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.099985 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frndn\" (UniqueName: \"kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.100410 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.202859 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.202949 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.203051 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frndn\" (UniqueName: \"kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.203443 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.203555 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.236788 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-frndn\" (UniqueName: \"kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn\") pod \"redhat-operators-8vfzt\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.296668 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.837341 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:28 crc kubenswrapper[4666]: I1203 13:57:28.925750 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerStarted","Data":"477910845d4f6942b556b269351863c6773442512ec8eec8db56957e3abc5414"} Dec 03 13:57:29 crc kubenswrapper[4666]: I1203 13:57:29.934135 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerID="81cd6a9db237dcde5961e5f1bc4aefaedfd73d6fa4d8c58094f27149f33863cb" exitCode=0 Dec 03 13:57:29 crc kubenswrapper[4666]: I1203 13:57:29.934194 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerDied","Data":"81cd6a9db237dcde5961e5f1bc4aefaedfd73d6fa4d8c58094f27149f33863cb"} Dec 03 13:57:31 crc kubenswrapper[4666]: I1203 13:57:31.952779 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerID="946fed75b3c2e30eb8a6178766ab046bec0130b37f7844e20802bf20047ae602" exitCode=0 Dec 03 13:57:31 crc kubenswrapper[4666]: I1203 13:57:31.953045 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerDied","Data":"946fed75b3c2e30eb8a6178766ab046bec0130b37f7844e20802bf20047ae602"} Dec 03 13:57:32 crc kubenswrapper[4666]: I1203 13:57:32.424180 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:57:32 crc kubenswrapper[4666]: E1203 13:57:32.424441 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:57:33 crc kubenswrapper[4666]: I1203 13:57:33.978168 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerStarted","Data":"84e84c3fcda1748c743951dfc53e073b23e213e6ea0099e7a496b0e8ab477d0d"} Dec 03 13:57:34 crc kubenswrapper[4666]: I1203 13:57:33.998559 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8vfzt" podStartSLOduration=4.016033292 podStartE2EDuration="6.998541135s" podCreationTimestamp="2025-12-03 13:57:27 +0000 UTC" firstStartedPulling="2025-12-03 13:57:29.936215214 +0000 UTC m=+6238.781176265" lastFinishedPulling="2025-12-03 13:57:32.918723057 +0000 UTC m=+6241.763684108" 
observedRunningTime="2025-12-03 13:57:33.993019996 +0000 UTC m=+6242.837981057" watchObservedRunningTime="2025-12-03 13:57:33.998541135 +0000 UTC m=+6242.843502186" Dec 03 13:57:38 crc kubenswrapper[4666]: I1203 13:57:38.298128 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:38 crc kubenswrapper[4666]: I1203 13:57:38.298741 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:38 crc kubenswrapper[4666]: I1203 13:57:38.360366 4666 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:39 crc kubenswrapper[4666]: I1203 13:57:39.067650 4666 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:40 crc kubenswrapper[4666]: I1203 13:57:40.731705 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:41 crc kubenswrapper[4666]: I1203 13:57:41.034531 4666 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8vfzt" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="registry-server" containerID="cri-o://84e84c3fcda1748c743951dfc53e073b23e213e6ea0099e7a496b0e8ab477d0d" gracePeriod=2 Dec 03 13:57:42 crc kubenswrapper[4666]: I1203 13:57:42.045703 4666 generic.go:334] "Generic (PLEG): container finished" podID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerID="84e84c3fcda1748c743951dfc53e073b23e213e6ea0099e7a496b0e8ab477d0d" exitCode=0 Dec 03 13:57:42 crc kubenswrapper[4666]: I1203 13:57:42.045744 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerDied","Data":"84e84c3fcda1748c743951dfc53e073b23e213e6ea0099e7a496b0e8ab477d0d"} Dec 03 13:57:42 crc kubenswrapper[4666]: I1203 13:57:42.837033 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.009950 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frndn\" (UniqueName: \"kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn\") pod \"5c4c9be0-5ef0-4cac-a315-85668c18f604\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.010055 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content\") pod \"5c4c9be0-5ef0-4cac-a315-85668c18f604\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.010177 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities\") pod \"5c4c9be0-5ef0-4cac-a315-85668c18f604\" (UID: \"5c4c9be0-5ef0-4cac-a315-85668c18f604\") " Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.010991 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities" (OuterVolumeSpecName: "utilities") pod "5c4c9be0-5ef0-4cac-a315-85668c18f604" (UID: "5c4c9be0-5ef0-4cac-a315-85668c18f604"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.019074 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn" (OuterVolumeSpecName: "kube-api-access-frndn") pod "5c4c9be0-5ef0-4cac-a315-85668c18f604" (UID: "5c4c9be0-5ef0-4cac-a315-85668c18f604"). InnerVolumeSpecName "kube-api-access-frndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.056227 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vfzt" event={"ID":"5c4c9be0-5ef0-4cac-a315-85668c18f604","Type":"ContainerDied","Data":"477910845d4f6942b556b269351863c6773442512ec8eec8db56957e3abc5414"} Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.056282 4666 scope.go:117] "RemoveContainer" containerID="84e84c3fcda1748c743951dfc53e073b23e213e6ea0099e7a496b0e8ab477d0d" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.056435 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vfzt" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.092034 4666 scope.go:117] "RemoveContainer" containerID="946fed75b3c2e30eb8a6178766ab046bec0130b37f7844e20802bf20047ae602" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.113163 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frndn\" (UniqueName: \"kubernetes.io/projected/5c4c9be0-5ef0-4cac-a315-85668c18f604-kube-api-access-frndn\") on node \"crc\" DevicePath \"\"" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.113243 4666 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 13:57:43 crc kubenswrapper[4666]: I1203 13:57:43.113491 4666 scope.go:117] "RemoveContainer" containerID="81cd6a9db237dcde5961e5f1bc4aefaedfd73d6fa4d8c58094f27149f33863cb" Dec 03 13:57:44 crc kubenswrapper[4666]: I1203 13:57:44.371682 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c4c9be0-5ef0-4cac-a315-85668c18f604" (UID: "5c4c9be0-5ef0-4cac-a315-85668c18f604"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 13:57:44 crc kubenswrapper[4666]: I1203 13:57:44.442428 4666 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4c9be0-5ef0-4cac-a315-85668c18f604-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 13:57:44 crc kubenswrapper[4666]: I1203 13:57:44.594961 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:44 crc kubenswrapper[4666]: I1203 13:57:44.604578 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8vfzt"] Dec 03 13:57:45 crc kubenswrapper[4666]: I1203 13:57:45.452074 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" path="/var/lib/kubelet/pods/5c4c9be0-5ef0-4cac-a315-85668c18f604/volumes" Dec 03 13:57:46 crc kubenswrapper[4666]: I1203 13:57:46.423515 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:57:46 crc kubenswrapper[4666]: E1203 13:57:46.423992 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:57:58 crc kubenswrapper[4666]: I1203 13:57:58.423955 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:57:58 crc kubenswrapper[4666]: E1203 13:57:58.424968 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:58:10 crc kubenswrapper[4666]: I1203 13:58:10.423638 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:58:10 crc kubenswrapper[4666]: E1203 13:58:10.424391 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:58:23 crc kubenswrapper[4666]: I1203 13:58:23.424392 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:58:23 crc kubenswrapper[4666]: E1203 13:58:23.425438 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:58:36 crc kubenswrapper[4666]: I1203 13:58:36.423801 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:58:36 crc kubenswrapper[4666]: E1203 13:58:36.424518 4666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q9g72_openshift-machine-config-operator(782e76d3-8dbe-4c2e-952c-6a966e2c06a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" Dec 03 13:58:51 crc kubenswrapper[4666]: I1203 13:58:51.433328 4666 scope.go:117] "RemoveContainer" containerID="32fcfbae43af398286f96aac43da0e496bb19016ed672f83ac5084bb42d8f820" Dec 03 13:58:51 crc kubenswrapper[4666]: I1203 13:58:51.735252 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" event={"ID":"782e76d3-8dbe-4c2e-952c-6a966e2c06a2","Type":"ContainerStarted","Data":"37c59f013acfef3104026767580048f2fb6abdfed22331e2fa9acd809433dd5f"} Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.149290 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd"] Dec 03 14:00:00 crc kubenswrapper[4666]: E1203 14:00:00.150314 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="extract-content" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.150329 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="extract-content" Dec 03 14:00:00 crc kubenswrapper[4666]: E1203 14:00:00.150342 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="registry-server" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.150350 4666 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="registry-server" Dec 03 14:00:00 crc kubenswrapper[4666]: E1203 14:00:00.150378 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="extract-utilities" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.150385 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="extract-utilities" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.150607 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4c9be0-5ef0-4cac-a315-85668c18f604" containerName="registry-server" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.151531 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.154662 4666 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.154901 4666 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.166016 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd"] Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.249070 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.249153 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.249188 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v5gl\" (UniqueName: \"kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.350909 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.350978 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: 
\"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.351013 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v5gl\" (UniqueName: \"kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.352205 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.358794 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.371596 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v5gl\" (UniqueName: \"kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl\") pod \"collect-profiles-29412840-6gtsd\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.472333 4666 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:00 crc kubenswrapper[4666]: W1203 14:00:00.931115 4666 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f08e235_f8c5_4d96_a9e7_1f9163279692.slice/crio-922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e WatchSource:0}: Error finding container 922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e: Status 404 returned error can't find the container with id 922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e Dec 03 14:00:00 crc kubenswrapper[4666]: I1203 14:00:00.932974 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd"] Dec 03 14:00:01 crc kubenswrapper[4666]: I1203 14:00:01.458494 4666 generic.go:334] "Generic (PLEG): container finished" podID="5f08e235-f8c5-4d96-a9e7-1f9163279692" containerID="5dc13f6d2118d828609015c9bcb19416257ec57490e60b6bb1f22722fedad71b" exitCode=0 Dec 03 14:00:01 crc kubenswrapper[4666]: I1203 14:00:01.458557 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" event={"ID":"5f08e235-f8c5-4d96-a9e7-1f9163279692","Type":"ContainerDied","Data":"5dc13f6d2118d828609015c9bcb19416257ec57490e60b6bb1f22722fedad71b"} Dec 03 14:00:01 crc kubenswrapper[4666]: I1203 14:00:01.458967 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" event={"ID":"5f08e235-f8c5-4d96-a9e7-1f9163279692","Type":"ContainerStarted","Data":"922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e"} Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.843631 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.904773 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v5gl\" (UniqueName: \"kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl\") pod \"5f08e235-f8c5-4d96-a9e7-1f9163279692\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.904872 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume\") pod \"5f08e235-f8c5-4d96-a9e7-1f9163279692\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.905031 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume\") pod \"5f08e235-f8c5-4d96-a9e7-1f9163279692\" (UID: \"5f08e235-f8c5-4d96-a9e7-1f9163279692\") " Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.905892 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f08e235-f8c5-4d96-a9e7-1f9163279692" (UID: "5f08e235-f8c5-4d96-a9e7-1f9163279692"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.912797 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl" (OuterVolumeSpecName: "kube-api-access-6v5gl") pod "5f08e235-f8c5-4d96-a9e7-1f9163279692" (UID: "5f08e235-f8c5-4d96-a9e7-1f9163279692"). InnerVolumeSpecName "kube-api-access-6v5gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:00:02 crc kubenswrapper[4666]: I1203 14:00:02.913046 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f08e235-f8c5-4d96-a9e7-1f9163279692" (UID: "5f08e235-f8c5-4d96-a9e7-1f9163279692"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.007189 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v5gl\" (UniqueName: \"kubernetes.io/projected/5f08e235-f8c5-4d96-a9e7-1f9163279692-kube-api-access-6v5gl\") on node \"crc\" DevicePath \"\"" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.007220 4666 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f08e235-f8c5-4d96-a9e7-1f9163279692-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.007229 4666 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f08e235-f8c5-4d96-a9e7-1f9163279692-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.481994 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" event={"ID":"5f08e235-f8c5-4d96-a9e7-1f9163279692","Type":"ContainerDied","Data":"922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e"} Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.482033 4666 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-6gtsd" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.482036 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="922970e72b6b325bd2526e0db4390760691c775e50d6c01fbb5f15aac521275e" Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.922625 4666 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5"] Dec 03 14:00:03 crc kubenswrapper[4666]: I1203 14:00:03.929650 4666 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412795-b6dh5"] Dec 03 14:00:05 crc kubenswrapper[4666]: I1203 14:00:05.435416 4666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88615659-1f7a-4d7e-ba15-d3f89534b454" path="/var/lib/kubelet/pods/88615659-1f7a-4d7e-ba15-d3f89534b454/volumes" Dec 03 14:00:54 crc kubenswrapper[4666]: I1203 14:00:54.775522 4666 scope.go:117] "RemoveContainer" containerID="0e2526aa90c73db388436fc1750c689248ccad689ea2d8c23cd1aada98be820b" Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.147457 4666 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412841-qz6tz"] Dec 03 14:01:00 crc kubenswrapper[4666]: E1203 14:01:00.148529 4666 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f08e235-f8c5-4d96-a9e7-1f9163279692" containerName="collect-profiles" Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.148546 4666 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f08e235-f8c5-4d96-a9e7-1f9163279692" containerName="collect-profiles" Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.148786 4666 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f08e235-f8c5-4d96-a9e7-1f9163279692" containerName="collect-profiles" Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.149554 4666 util.go:30] "No sandbox for pod can be found. 
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.149554 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.164058 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412841-qz6tz"]
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.302018 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.302395 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.302549 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9tp\" (UniqueName: \"kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.302742 4666 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.405018 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.405377 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9tp\" (UniqueName: \"kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.405495 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.405621 4666 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.412678 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.413494 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.423471 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.424111 4666 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9tp\" (UniqueName: \"kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp\") pod \"keystone-cron-29412841-qz6tz\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") " pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.489618 4666 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:00 crc kubenswrapper[4666]: I1203 14:01:00.921936 4666 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412841-qz6tz"]
Dec 03 14:01:01 crc kubenswrapper[4666]: I1203 14:01:01.120057 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412841-qz6tz" event={"ID":"180fff92-3dba-43e1-8a4b-8cad1e8c0b50","Type":"ContainerStarted","Data":"1f17d556746d41351019cfd33cf1a3bbb501434404713c622a14a41a4b9800d9"}
Dec 03 14:01:01 crc kubenswrapper[4666]: I1203 14:01:01.120372 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412841-qz6tz" event={"ID":"180fff92-3dba-43e1-8a4b-8cad1e8c0b50","Type":"ContainerStarted","Data":"37293cf8b00cbe02d66f26ef8fe2d46afe7bdf2f0cd9ac7ddab2e11e154e1d31"}
Dec 03 14:01:01 crc kubenswrapper[4666]: I1203 14:01:01.135896 4666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412841-qz6tz" podStartSLOduration=1.135872941 podStartE2EDuration="1.135872941s" podCreationTimestamp="2025-12-03 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:01:01.134131915 +0000 UTC m=+6449.979092976" watchObservedRunningTime="2025-12-03 14:01:01.135872941 +0000 UTC m=+6449.980834002"
Dec 03 14:01:04 crc kubenswrapper[4666]: I1203 14:01:04.163760 4666 generic.go:334] "Generic (PLEG): container finished" podID="180fff92-3dba-43e1-8a4b-8cad1e8c0b50" containerID="1f17d556746d41351019cfd33cf1a3bbb501434404713c622a14a41a4b9800d9" exitCode=0
Dec 03 14:01:04 crc kubenswrapper[4666]: I1203 14:01:04.163846 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412841-qz6tz" event={"ID":"180fff92-3dba-43e1-8a4b-8cad1e8c0b50","Type":"ContainerDied","Data":"1f17d556746d41351019cfd33cf1a3bbb501434404713c622a14a41a4b9800d9"}
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.526886 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.610779 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data\") pod \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") "
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.610845 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys\") pod \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") "
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.610986 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle\") pod \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") "
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.611189 4666 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9tp\" (UniqueName: \"kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp\") pod \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\" (UID: \"180fff92-3dba-43e1-8a4b-8cad1e8c0b50\") "
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.619216 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "180fff92-3dba-43e1-8a4b-8cad1e8c0b50" (UID: "180fff92-3dba-43e1-8a4b-8cad1e8c0b50"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.619318 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp" (OuterVolumeSpecName: "kube-api-access-4g9tp") pod "180fff92-3dba-43e1-8a4b-8cad1e8c0b50" (UID: "180fff92-3dba-43e1-8a4b-8cad1e8c0b50"). InnerVolumeSpecName "kube-api-access-4g9tp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.653187 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180fff92-3dba-43e1-8a4b-8cad1e8c0b50" (UID: "180fff92-3dba-43e1-8a4b-8cad1e8c0b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.679579 4666 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data" (OuterVolumeSpecName: "config-data") pod "180fff92-3dba-43e1-8a4b-8cad1e8c0b50" (UID: "180fff92-3dba-43e1-8a4b-8cad1e8c0b50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.713873 4666 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.713907 4666 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.713915 4666 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:01:05 crc kubenswrapper[4666]: I1203 14:01:05.713928 4666 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9tp\" (UniqueName: \"kubernetes.io/projected/180fff92-3dba-43e1-8a4b-8cad1e8c0b50-kube-api-access-4g9tp\") on node \"crc\" DevicePath \"\""
Dec 03 14:01:06 crc kubenswrapper[4666]: I1203 14:01:06.182558 4666 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412841-qz6tz" event={"ID":"180fff92-3dba-43e1-8a4b-8cad1e8c0b50","Type":"ContainerDied","Data":"37293cf8b00cbe02d66f26ef8fe2d46afe7bdf2f0cd9ac7ddab2e11e154e1d31"}
Dec 03 14:01:06 crc kubenswrapper[4666]: I1203 14:01:06.182591 4666 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37293cf8b00cbe02d66f26ef8fe2d46afe7bdf2f0cd9ac7ddab2e11e154e1d31"
Dec 03 14:01:06 crc kubenswrapper[4666]: I1203 14:01:06.182601 4666 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412841-qz6tz"
Dec 03 14:01:09 crc kubenswrapper[4666]: I1203 14:01:09.866598 4666 patch_prober.go:28] interesting pod/machine-config-daemon-q9g72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 14:01:09 crc kubenswrapper[4666]: I1203 14:01:09.867370 4666 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q9g72" podUID="782e76d3-8dbe-4c2e-952c-6a966e2c06a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"